CN115545959A - Skateboard system - Google Patents

Skateboard system

Info

Publication number: CN115545959A
Application number: CN202210766948.2A
Authority: CN (China)
Prior art keywords: user, performance, computer, data, activity
Legal status: Pending (the legal status is an assumption and is not a legal conclusion)
Other languages: Chinese (zh)
Inventor: J. Agnew
Current assignee: Nike Innovate CV USA (the listed assignees may be inaccurate)
Original assignee: Nike Innovate CV USA
Application filed by Nike Innovate CV USA


Classifications

    • G: PHYSICS
    • G16: INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H: HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/30: ICT specially adapted for therapies or health-improving plans, relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H40/40: ICT for the management of medical equipment or devices, e.g. scheduling maintenance or upgrades
    • G16H40/67: ICT for the operation of medical equipment or devices, for remote operation

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • General Business, Economics & Management (AREA)
  • Epidemiology (AREA)
  • Business, Economics & Management (AREA)
  • Medical Informatics (AREA)
  • Primary Health Care (AREA)
  • Public Health (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Example embodiments may be directed to a system, method, apparatus, and computer-readable medium configured to monitor a user performing various athletic movements and generate performance characteristics based on data corresponding to those athletic movements. Users may also be encouraged to participate in athletic challenges or competitions against other users or groups of users. In addition, athletic activity data for multiple people may be collected at a central location and subsequently displayed to a user at a desired remote location, enabling the user to compare his/her athletic activity with that of others.

Description

Skateboard system
This application is a divisional of the invention patent application filed May 30, 2014, with application No. 201480043593.8, entitled "Skateboard system".
CROSS-REFERENCE TO RELATED APPLICATIONS
This application claims the benefit of and priority to U.S. patent application No. 61/829,809, filed May 31, 2013, and to U.S. patent application No. 61/874,248, filed September 5, 2013. This application is also a continuation-in-part of co-pending PCT application No. PCT/US14/27519, entitled "ATHLETIC ATTRIBUTE DETECTION FROM IMAGE DATA," filed March 14, 2014, which claims the benefit of and priority to U.S. patent application No. 61/783,328, filed September 5, 2013. The contents of the above-referenced applications are incorporated herein by reference in their entirety and made a part hereof.
Technical Field
The invention relates to the collection and display of athletic information. Aspects of the present invention are particularly well suited for collecting motion information over a network and displaying the collected information.
Background
Exercise and fitness have become increasingly popular, and the benefits of such activities are well known. Various types of technology have been incorporated into fitness and other athletic activities. For example, a wide variety of portable electronic devices may be used in fitness activities, such as MP3 or other audio players, radios, portable televisions, DVD players or other video-playing devices, watches, GPS systems, pedometers, mobile telephones, pagers, beepers, and the like. Many fitness enthusiasts or athletes use one or more of these devices while exercising or training to keep themselves entertained, obtain performance data, or stay in contact with others, among other things. Such users may also be interested in recording their athletic activities and the metrics associated therewith. Accordingly, various sensors may be used to detect, store, and/or transmit athletic performance information. However, athletic performance information is often presented in isolation or based only on overall athletic activity. Exercisers may be interested in obtaining additional information about their workouts.
Disclosure of Invention
The following presents a general summary of example aspects in order to provide a basic understanding of example embodiments. This summary is not an extensive overview. It is not intended to identify key/critical elements or to delineate the scope of the invention. The following summary merely presents some concepts of the invention in a general form as a prelude to the more detailed description provided below.
One or more aspects describe systems, apparatuses, computer-readable media, and methods for using geographic information related to athletic activities. Sensors may be attached to the user and/or to clothing to generate performance data. The sensors may include accelerometers, pressure sensors, gyroscopes, and other sensors that can transform physical activity into electrical signals. The data, along with location data, may be transmitted to a server. The server may maintain leaderboards for users and for locations, and may allow users to search for other users and for locations of athletic activity. In some aspects of the invention, a user interacts with the server using a mobile device, such as a mobile phone.
In some aspects, systems, devices, computer-readable media, and methods may be configured to process an input specifying a user attribute, adjust a performance zone based on the user attribute, receive data generated by at least one of an accelerometer and a force sensor, determine whether the data is within the performance zone, and output the determination.
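For illustration only, the following Python sketch shows one way such a performance-zone check might be structured; the attribute names, scaling rule, and threshold values are assumptions made for this example and are not part of the disclosure.

```python
# Minimal sketch of a performance-zone check, assuming a zone is a range
# on one sensor-derived metric. All names and numbers are illustrative.
from dataclasses import dataclass

@dataclass
class PerformanceZone:
    min_jump_cm: float
    max_jump_cm: float

    def adjust_for_user(self, height_cm: float, age: int) -> "PerformanceZone":
        # Hypothetical adjustment based on user attributes (assumed rule).
        scale = (height_cm / 175.0) * (1.0 if age < 40 else 0.9)
        return PerformanceZone(self.min_jump_cm * scale, self.max_jump_cm * scale)

def in_zone(zone: PerformanceZone, measured_jump_cm: float) -> bool:
    """Determine whether accelerometer/force-derived data falls in the zone."""
    return zone.min_jump_cm <= measured_jump_cm <= zone.max_jump_cm

base = PerformanceZone(min_jump_cm=30.0, max_jump_cm=60.0)
zone = base.adjust_for_user(height_cm=185.0, age=25)
print(in_zone(zone, measured_jump_cm=42.0))  # True: output the determination
```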
In some aspects, systems, devices, computer-readable media, and methods may include receiving data generated by sensors (e.g., accelerometers, force sensors, temperature sensors, heart rate monitors, etc.) as a user performs athletic movements and comparing the data to comparison data for a plurality of game types to determine a particular one of the game types that most closely matches the data.
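A minimal sketch of this game-type matching follows, assuming each session is summarized as a small feature vector and each game type has stored comparison data; the features and reference values are invented for illustration.

```python
# Nearest-profile matching of session data against per-game-type
# comparison data. Feature choices and reference numbers are assumptions.
import math

GAME_TYPE_PROFILES = {
    # (mean acceleration magnitude in g, foot strikes per minute)
    "basketball": (1.8, 95.0),
    "running":    (1.2, 160.0),
    "tennis":     (1.6, 70.0),
}

def closest_game_type(mean_accel_g: float, strikes_per_min: float) -> str:
    """Return the game type whose comparison data most closely matches."""
    def distance(profile):
        da = profile[0] - mean_accel_g
        ds = (profile[1] - strikes_per_min) / 100.0  # crude rescaling
        return math.hypot(da, ds)
    return min(GAME_TYPE_PROFILES, key=lambda k: distance(GAME_TYPE_PROFILES[k]))

print(closest_game_type(mean_accel_g=1.7, strikes_per_min=90.0))  # basketball
```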
In some example aspects, systems, computer-readable media, and methods may include receiving data generated by a force sensor indicative of a weight distribution over the course of a plurality of exercise tasks, processing a first input indicative of successful completion of an exercise task, associating a first weight distribution at a time prior to the first input with successful completion of the exercise task, processing a second input indicative of unsuccessful completion of the exercise task, and associating a second weight distribution at a time prior to the second input with unsuccessful completion of the exercise task.
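The association step might be sketched as below, assuming timestamped weight-distribution samples and success/failure inputs; the data layout and look-back window are illustrative assumptions.

```python
# Associate the weight distribution observed shortly before a success or
# failure input with that outcome. Data layout and window are assumed.
from collections import defaultdict

def distribution_before(history, event_time, lookback_s=0.5):
    """history: list of (timestamp, (heel_pct, forefoot_pct)) samples.
    Returns the most recent sample within the look-back window."""
    prior = [(t, d) for t, d in history if event_time - lookback_s <= t < event_time]
    return prior[-1][1] if prior else None

outcomes = defaultdict(list)

def record_outcome(history, event_time, success: bool):
    dist = distribution_before(history, event_time)
    if dist is not None:
        outcomes["success" if success else "failure"].append(dist)

samples = [(0.0, (60, 40)), (0.4, (55, 45)), (0.9, (30, 70))]
record_outcome(samples, event_time=1.0, success=True)
print(outcomes["success"])  # [(30, 70)]
```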
In some example aspects, systems, computer-readable media, and methods may include receiving signature action data corresponding to acceleration and force measurement data measured for a first user performing a sequence of events, receiving player data from at least one of an accelerometer and a force sensor by monitoring a second user attempting to perform the sequence of events, and generating a similarity indicator indicating how similar the player data is to the signature action data.
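One simple way to produce such a similarity indicator is sketched below using cosine similarity over aligned sensor traces; the disclosure does not prescribe a particular measure, so this choice is purely an assumption for the example.

```python
# Similarity indicator between a signature-action trace and a player's
# attempt. Cosine similarity over equal-length traces is one simple choice.
import math

def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

signature = [0.1, 0.9, 1.8, 0.7, 0.2]  # e.g., acceleration magnitude over time
attempt   = [0.2, 0.8, 1.6, 0.9, 0.1]
print(f"similarity: {cosine_similarity(signature, attempt):.0%}")
```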
In some example aspects, systems, computer-readable media, and methods may include receiving data generated by at least one of an accelerometer and a force sensor, comparing the data to jump data to determine whether the data is consistent with a jump, processing the data to determine a take-off time, a landing time, and a rise time, and calculating a vertical jump height based on the rise time.
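The rise-time calculation follows from projectile kinematics: a body that rises for time t under gravity g reaches height h = ½·g·t², and when only total hang time is available, symmetry gives t ≈ t_flight / 2. A sketch under those assumptions:

```python
# Vertical jump height from the rise time (flight-time method). The event
# timestamps are assumed to come from detected take-off/landing events.
G = 9.81  # m/s^2

def jump_height_from_rise(rise_time_s: float) -> float:
    return 0.5 * G * rise_time_s ** 2

def jump_height_from_flight(takeoff_s: float, landing_s: float) -> float:
    rise = (landing_s - takeoff_s) / 2.0  # assume symmetric rise and fall
    return jump_height_from_rise(rise)

print(f"{jump_height_from_flight(0.00, 0.62):.2f} m")  # ~0.47 m
```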
Throughout this disclosure, other aspects and features will be described.
Drawings
For an understanding of example embodiments, reference will now be made, by way of example, to the following drawings in which:
FIGS. 1A-B illustrate examples of a personal training system, according to example embodiments.
FIGS. 2A-B illustrate example embodiments of a sensor system, according to example embodiments.
FIGS. 3A-B illustrate examples of a computer interacting with at least one sensor, according to example embodiments.
FIG. 4 illustrates an example of a round pod sensor that may be embedded in and removed from a shoe, according to an example embodiment.
FIG. 5 illustrates an example on-body configuration of a computer, according to an example embodiment.
FIGS. 6-7 illustrate various example off-body configurations of a computer, according to example embodiments.
FIG. 8 illustrates an example display of a graphical user interface (GUI) presented via a display screen of a computer, according to an example embodiment.
FIG. 9 illustrates an example performance metric selected by a user, according to an example embodiment.
FIGS. 10-11 illustrate examples of calibrating a sensor, according to example embodiments.
FIG. 12 illustrates an example display of a GUI presenting information relative to an activity session, according to an example embodiment.
FIG. 13 illustrates an example display of a GUI providing users with information about their performance metrics during a session, according to an example embodiment.
FIG. 14 illustrates an example display of a GUI presenting information about a user's virtual business card (vcard), according to an example embodiment.
FIG. 15 illustrates an example user profile display of a GUI presenting a user profile, according to an example embodiment.
FIG. 16 illustrates a further example of a user profile display presenting additional information about a user, according to an example embodiment.
FIGS. 17-20 illustrate further example displays of a GUI for displaying performance metrics to a user, according to example embodiments.
FIG. 21 illustrates an example freeform display of a GUI providing freeform user movement information, according to an example embodiment.
FIG. 22 illustrates an example training display presenting user-selectable training sessions, according to an example embodiment.
FIGS. 23-26 illustrate example training sessions, according to example embodiments.
FIGS. 27-30 illustrate display screens of a GUI for a basketball shooting training session, according to an example embodiment.
FIG. 31 illustrates an example display of a GUI notifying a user of shooting milestones, according to an example embodiment.
FIG. 32 illustrates an example signature action display of a GUI prompting a user to perform a signature action workout mimicking a professional athlete, according to an example embodiment.
FIG. 33 illustrates an example display of a GUI for searching for other users and/or professional athletes for comparison of performance metrics, according to an example embodiment.
FIGS. 34-35 illustrate example displays for comparing a user's performance metrics to those of other individuals, according to example embodiments.
FIG. 36 illustrates a flow diagram of an example method for determining whether physical data obtained while monitoring physical activity performed by a user is within a performance zone, according to an example embodiment.
FIG. 37 illustrates two example GUI displays for identifying nearby basketball courts.
FIG. 38 illustrates an example GUI for obtaining activity information about other participants.
FIG. 39 illustrates a process that may be used to find locations for athletic activity, according to an embodiment.
FIG. 40 illustrates a process for sharing performance data, according to an embodiment of the invention.
FIG. 41 illustrates a process that may be used to track and compare performance data, according to an embodiment of the invention.
FIG. 42 is a flow diagram of an example method that may be utilized in accordance with various embodiments.
FIGS. 43-50 illustrate example displays of a GUI that may be utilized to review and edit captured images, in accordance with various embodiments.
FIGS. 51-91 illustrate example displays of tree diagrams that may be utilized for various skill types, in accordance with various embodiments.
Detailed Description
In the following description of the various embodiments, reference is made to the accompanying drawings, which form a part hereof, and in which is shown by way of illustration various embodiments in which the disclosure may be practiced. It is to be understood that other embodiments may be utilized and structural and functional modifications may be made without departing from the scope and spirit of the present disclosure. Further, headings within the disclosure should not be considered limiting aspects of the disclosure. Those skilled in the art, having benefit of this disclosure, will appreciate that the example embodiments are not limited to the example headings.
I. Example personal training System
A. Illustrative computing System
FIG. 1A illustrates an example of a personal training system 100, according to an example embodiment. The example system 100 may include one or more electronic devices, such as a computer 102. The computer 102 may comprise a mobile terminal such as a telephone, music player, tablet, netbook, or other portable device. In other embodiments, computer 102 may comprise a set-top box (STB), desktop computer, digital video recorder(s) (DVR), computer server(s), and/or any other desired computing device. In some embodiments, the computer 102 may comprise a game console, such as, for example, a Microsoft® XBOX, Sony® Playstation, and/or Nintendo® Wii game console. Those skilled in the art will appreciate that these are merely example consoles for illustrative purposes, and the present disclosure is not limited to any console or device.
Turning briefly to FIG. 1B, the computer 102 may include a computing unit 104, which may include at least one processing unit 106. The processing unit 106 may be any type of processing device for executing software instructions, such as, for example, a microprocessor device. The computer 102 may include a variety of non-transitory computer-readable media, such as memory 108. Memory 108 may include, but is not limited to, random access memory (RAM), such as RAM 110, and/or read-only memory (ROM), such as ROM 112. Memory 108 may include any of: electronically erasable programmable read-only memory (EEPROM), flash memory or other memory technology, CD-ROM, digital versatile disks (DVD) or other optical disk storage, magnetic disk storage devices, or other media that can be used to store the desired information and that can be accessed by computer 102.
The processing unit 106 and the system memory 108 may be connected, directly or indirectly, to one or more peripheral devices through a bus 114 or an alternate communication structure. For example, the processing unit 106 or the system memory 108 may be directly or indirectly connected to additional memory storage, such as a hard disk drive 116, a removable magnetic disk drive, an optical disk drive 118, and a flash memory card. The processing unit 106 and the system memory 108 may also be connected, directly or indirectly, to one or more input devices 120 and one or more output devices 122. The output devices 122 may include, for example, a display device 136, a television, a printer, a stereo, or speakers. In some embodiments, one or more display devices may be incorporated into eyewear. Display devices incorporated into eyewear may provide feedback to users. Eyewear incorporating one or more display devices also provides for a portable display system. The input devices 120 may include, for example, a keyboard, a touch screen, a remote control pad, a pointing device (such as a mouse, touchpad, stylus, trackball, or joystick), a scanner, a camera, or a microphone. In this regard, the input devices 120 may include one or more sensors configured to sense, detect, and/or measure athletic movement from a user, such as user 124, shown in FIG. 1A.
Referring again to FIG. 1A, image capture device 126 and/or sensor 128 may be utilized in detecting and/or measuring the physical activity of user 124. In one embodiment, data obtained from image capture device 126 or sensor 128 may directly detect athletic movement, such that the data obtained from image capture device 126 or sensor 128 is directly correlated to a motion parameter. However, in other embodiments, data from image capture device 126 and/or sensor 128 may be utilized in combination, either with each other or with other sensors, to detect and/or measure motion. Thus, certain measurements may be determined by combining data obtained from two or more devices. Image capture device 126 and/or sensor 128 may include or be operatively connected to one or more sensors, including but not limited to: an accelerometer, a gyroscope, a location-determining device (e.g., GPS), a light sensor, a temperature sensor (including ambient temperature and/or body temperature), a heart rate monitor, an image-capturing sensor, a humidity sensor, and/or combinations thereof. Example uses of illustrative sensors 126, 128 are provided below in Section I.C, entitled "Illustrative Sensors." The computer 102 may also use a touch screen or image capture device to determine where a user is pointing on a graphical user interface to make selections. One or more embodiments may utilize one or more wired and/or wireless technologies, alone or in combination, examples of which include Bluetooth® technologies, Bluetooth® low energy technologies, and/or ANT technologies.
B. Illustrative network
The computer 102, computing unit 104, and/or any other electronic device may be directly or indirectly connected to one or more network interfaces, such as the example interface 130 (shown in FIG. 1B), for communicating with a network, such as network 132. In the example of FIG. 1B, network interface 130 may comprise a network adapter or network interface card (NIC) configured to translate data and control signals from the computing unit 104 into network messages according to one or more communication protocols, such as the Transmission Control Protocol (TCP), the Internet Protocol (IP), and the User Datagram Protocol (UDP). These protocols are well known in the art and, as such, will not be discussed in detail here. The interface 130 may employ any suitable connection agent for connecting to a network, including, for example, a wireless transceiver, a power line adapter, a modem, or an Ethernet connection. The network 132, however, may be one or more information distribution networks of any type or topology, alone or in combination, such as internet(s), intranet(s), cloud(s), or local area network(s). The network 132 may be any one or more of cable, fiber optic, satellite, telephone, cellular, wireless, etc. Networks are well known in the art and, as such, will not be discussed in detail here. The network 132 may be variously configured, such as having one or more wired or wireless communication channels connecting one or more locations (e.g., schools, businesses, homes, consumer residences, network resources, etc.) to one or more remote servers 134, or to other computers, such as computers similar to or the same as computer 102. Indeed, the system 100 may include more than one instance of each component (e.g., more than one computer 102, more than one display 136, etc.).
Regardless of whether the computer 102 or other electronic devices within the network 132 are portable or in a fixed location, it should be appreciated that in addition to the input, output, and storage peripherals specifically listed above, the computer device may be connected to a variety of other peripherals, either directly or through the network 132, including some peripherals that perform input, output, and storage functions, or a combination thereof. In certain embodiments, a single device may integrate one or more of the components shown in FIG. 1A. For example, a single device may include the computer 102, the image acquisition device 126, the sensor 128, the display 136, and/or additional components. In one embodiment, the sensor device 138 may comprise a mobile terminal having a display 136, an image capture device 126, and one or more sensors 128. However, in another embodiment, the image capture device 126, and/or the sensor 128, may be peripheral devices configured to be operatively connected to a media device, including, for example, a game or media system. Thus, from the foregoing, it will be appreciated that the present disclosure is not limited to fixed systems and methods. Rather, some embodiments may be implemented by user 124 in almost any location.
C. Illustrative sensor
Computer 102 and/or other devices may include one or more sensors 126, 128 configured to detect and/or monitor at least one fitness parameter of user 124. The sensors 126 and/or 128 may include, but are not limited to: an accelerometer, a gyroscope, a location-determining device (e.g., GPS), a light sensor, a temperature sensor (including ambient and/or body temperature), a sleep pattern sensor, a heart rate monitor, an image-capturing sensor, a humidity sensor, and/or combinations thereof. The network 132 and/or the computer 102 may communicate with one or more electronic devices of the system 100, including, for example, a display 136, an image capture device 126 (e.g., one or more video cameras), and a sensor 128, which may be an infrared (IR) device. In one embodiment, the sensor 128 may comprise an IR transceiver. For example, sensors 126 and/or 128 may transmit waveforms into the environment, including in the direction of user 124, and receive "reflections" or otherwise detect alterations of those emitted waveforms. In yet another embodiment, image capture device 126 and/or sensor 128 may be configured to transmit and/or receive other wireless signals, such as radar, sonar, and/or other acoustic information. Those skilled in the art will readily appreciate that signals corresponding to a multitude of data spectra may be utilized in accordance with various embodiments. In this regard, the sensors 126 and/or 128 may detect waveforms emitted from external sources (i.e., not the system 100). For example, sensors 126 and/or 128 may detect heat emitted by user 124 and/or the surrounding environment. Thus, image capture device 126 and/or sensor 128 may include one or more thermal imaging devices. In one embodiment, the image capture device 126 and/or the sensor 128 may include an IR device configured to perform range phenomenology. As a non-limiting example, image capture devices configured to perform range phenomenology are commercially available from Flir Systems, Inc. of Portland, Oregon. Although image capture device 126, sensor 128, and display 136 are shown as communicating directly (wirelessly or wired) with computer 102, one skilled in the art will appreciate that any of the above may communicate directly (wirelessly or wired) with network 132.
1. Multipurpose electronic equipment
User 124 may own, carry, and/or wear any number of electronic devices, including sensing devices 138, 140, 142, and/or 144. In certain embodiments, one or more of the devices 138, 140, 142, 144 may not be specially manufactured for fitness or athletic purposes. Indeed, aspects of the present disclosure relate to utilizing data from a plurality of devices, some of which are not fitness devices, to collect, detect, and/or measure athletic data. In one embodiment, device 138 may comprise a portable electronic device, such as a telephone or digital music player, including an IPOD® or iPhone® device, commercially available from Apple Inc. of Cupertino, California, or a ZUNE® or Microsoft® Windows device, commercially available from Microsoft Corporation of Redmond, Washington. As is known in the art, a digital media player may serve as both an output device (e.g., outputting music from a sound file or pictures from an image file) and a storage device for a computer. In one embodiment, the device 138 may be the computer 102, but in other embodiments the computer 102 may be entirely distinct from the device 138. Regardless of whether device 138 is configured to provide certain output, it may still serve as an input device for receiving sensory information. Devices 138, 140, 142, and/or 144 may include one or more sensors, including but not limited to: an accelerometer, a gyroscope, a location-determining device (e.g., GPS), a light sensor, a temperature sensor (including ambient and/or body temperature), a heart rate monitor, an image-capturing sensor, a humidity sensor, and/or combinations thereof. In some embodiments, sensors may be passive, such as reflective materials that can be detected by the image capture device 126 and/or the sensor 128 (among others). In certain embodiments, the sensors 144 may be integrated into apparel, such as athletic clothing. For example, the user 124 may wear one or more on-body sensors 144a-b. Sensors 144 may be incorporated into the clothing of user 124 and/or placed at any desired location on the body of user 124. The sensors 144 may communicate (e.g., wirelessly) with the computer 102, the sensors 128, 138, 140, and 142, and/or the camera 126. An example of interactive gaming apparel is described in U.S. patent application Ser. No. 10/286,396, filed October 30, 2002, and published as U.S. patent publication No. 2004/0087366, the contents of which are incorporated herein by reference in their entirety for any and all non-limiting purposes. In certain embodiments, passive sensing surfaces may reflect waveforms, such as infrared light, emitted by image capture device 126 and/or sensor 128. In one embodiment, passive sensors located on the apparel of user 124 may comprise generally spherical structures made of glass or other transparent or translucent surfaces that may reflect waveforms. Different classes of apparel may be utilized, where a given class of apparel has specific sensors configured to be positioned proximate to a specific portion of the body of user 124 when properly worn. For example, golf apparel may include one or more sensors positioned on the apparel in a first configuration, while football apparel may include one or more sensors positioned on the apparel in a second configuration.
The devices 138-144, as well as any other electronic devices disclosed herein, including any sensor devices, may communicate with each other directly or through a network, such as the network 132. Communication between one or more devices 138-144 may occur via computer 102. For example, two or more of the devices 138-144 may be peripheral devices that are operatively coupled to the bus 114 of the computer 102. In yet another embodiment, a first device, such as device 138, may communicate with a first computer, such as computer 102, and another device, such as device 142, however, device 142 may not be configured to connect to computer 102 but may communicate with device 138. Further, one or more devices may be configured to communicate over multiple communication paths. For example, the device 140 may be configured to communicate with the device 138 via a wireless communication protocol and further communicate with a different device, such as, for example, the computer 102, via a second wireless communication protocol. Example wireless protocols are discussed throughout this disclosure and are known in the art. Those skilled in the art will appreciate that other configurations are possible.
Some implementations of the example embodiments may alternatively or additionally employ a computing device intended to be capable of various functions, such as a desktop computer or a notebook personal computer. These computing devices may have any combination of peripherals or additional components, as desired. Also, the components shown in FIG. 1B may be included in a server 134, other computer, device, or the like.
2. Illustrative apparel/accessory sensors
In certain embodiments, sensing devices 138, 140, 142, and/or 144 are formed within clothing or accessories of user 124 or otherwise associated with user 124, including watches, arm bands, wristbands, necklaces, shirts, shoes, and the like. The shoe-mounted sensor and wrist-worn device ( devices 140 and 142, respectively) will be described immediately below, however, these are merely example embodiments and the disclosure should not be limited thereto.
i. Shoe-mounted device
In some embodiments, the sensing device 140 may comprise footwear that includes one or more sensors, including but not limited to: an accelerometer, a position-sensing component such as GPS, and/or a force sensor system. FIG. 2A illustrates one example embodiment of a sensor system 202. In certain embodiments, the system 202 may include a sensor assembly 204. The assembly 204 may include one or more sensors, such as, for example, accelerometers, position-determining components, and/or force sensors. In the illustrated embodiment, the assembly 204 incorporates a plurality of sensors, which may include force-sensitive resistor (FSR) sensors 206. In yet other embodiments, other sensor(s) may be utilized. The port 208 may be positioned within a sole structure 209 of the footwear. The port 208 may optionally be provided in communication with an electronics module 210 (which may be in a housing 211), and a plurality of leads 212 connecting the FSR sensors 206 to the port 208. The module 210 may be contained in a recess or chamber in the sole structure of the footwear. The port 208 and the module 210 include complementary interfaces 214, 216 for connection and communication.
In certain embodiments, the at least one force-sensitive resistor 206 shown in FIG. 2A may include first and second electrodes or electrical contacts 218, 220, and a force-sensitive resistive material 222 disposed between the electrodes 218, 220 to electrically connect the electrodes 218, 220 together. When pressure is applied to the force-sensitive material 222, the resistivity and/or conductivity of the force-sensitive material 222 changes, thereby changing the electrical potential between the electrodes 218, 220. The change in resistance may be detected by the sensor system 202, thereby detecting the force exerted on the sensor 206. The force-sensitive resistive material 222 can change its resistance under pressure in a variety of ways. For example, the force-sensitive material 222 can have an internal resistance that decreases when the material is compressed, similar to the quantum tunneling composites described in more detail below. Further compression of such materials may further reduce the resistance, allowing quantitative measurements as well as binary (on/off) measurements. In some cases, this type of force-sensitive resistive behavior may be described as "volume-based resistance," and materials exhibiting this behavior may be referred to as "smart materials." As another example, the material 222 may change resistance by changing the degree of surface-to-surface contact. This can be achieved in a number of ways, such as by micro-protrusions on the surface that increase the surface resistance in an uncompressed condition, where the surface resistance decreases when the micro-protrusions are compressed, or by using a flexible electrode that can deform to create increased surface-to-surface contact with another electrode. Such a surface resistance may be a surface resistance between the material 222 and the electrodes 218, 220, and/or a surface resistance between a conductive layer (e.g., carbon/graphite) and a force-sensitive layer (e.g., a semiconductor) of a multi-layered material 222. The greater the compression, the greater the surface-to-surface contact, resulting in lower resistance and enabling quantitative measurement. In some cases, this type of force-sensitive resistive behavior may be described as "contact-based resistance." It is understood that the force-sensitive resistive material 222, as defined herein, may be a doped or undoped semiconductor material.
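As a concrete illustration of reading out such a quantitative FSR measurement, the sketch below converts an ADC sample from a voltage-divider circuit into an approximate force. The supply voltage, divider resistor, and power-law calibration constants are assumptions, not values from this disclosure; a real sensor would be calibrated against known loads.

```python
# Convert a raw ADC reading from an FSR voltage divider into a force
# estimate. Assumed circuit: FSR from supply to V_out, fixed resistor to ground.
V_SUPPLY = 3.3        # volts (assumed)
R_FIXED  = 10_000.0   # ohms, fixed divider resistor (assumed)
ADC_MAX  = 4095       # 12-bit ADC (assumed)

def fsr_resistance(adc_reading: int) -> float:
    v_out = V_SUPPLY * adc_reading / ADC_MAX
    if v_out <= 0.0:
        return float("inf")  # no pressure: effectively an open circuit
    return R_FIXED * (V_SUPPLY - v_out) / v_out

def force_newtons(adc_reading: int, k: float = 1e5, n: float = 1.3) -> float:
    r = fsr_resistance(adc_reading)
    if r == float("inf"):
        return 0.0           # no load detected
    if r == 0.0:
        return float("inf")  # saturated: beyond the assumed calibration range
    return (k / r) ** n      # assumed power-law calibration fit

print(f"{force_newtons(2048):.1f} N")  # ~20 N with these assumed constants
```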
The electrodes 218, 220 of the FSR sensor 206 may be formed of any conductive material, including metals, carbon/graphite fibers or composites, other conductive composites, conductive polymers or polymers containing conductive materials, conductive ceramics, doped semiconductors, or any other conductive material. The leads 212 may be connected to the electrodes 218, 220 by any suitable method, including welding, soldering, brazing, adhesive bonding, fasteners, or any other integral or non-integral joining method. Alternatively, the electrodes 218, 220 and associated leads 212 may be formed from a single piece of the same material.
Other embodiments of the sensor system 202 may include a different number and/or configuration of sensors, and generally include at least one sensor. For example, in one embodiment, system 202 includes a greater number of sensors, and in another embodiment, system 202 includes two sensors, one in the heel and one in the forefoot of the shoe, or in a device in close proximity to the user's foot. Additionally, the one or more sensors 206 may communicate with the port 208 in different ways, including any known type of wired or wireless communication, such as Bluetooth® or near-field communication. A pair of shoes may be provided with sensor systems 202 in each shoe, and it is understood that the paired sensor systems may operate cooperatively or may operate independently of each other, and that the sensor systems in each shoe may or may not communicate with each other. It is also understood that sensor system 202 may be provided with computer-executable instructions stored on one or more computer-readable media that, when executed by a processor, control the collection and storage of data (e.g., pressure data from the interaction of a user's foot with the ground or other contact surface), and that these executable instructions may be stored in and/or executed by the sensors 206, any module, and/or an external device, such as device 128, computer 102, server 134, and/or network 132 shown in FIG. 1A.
ii. Wrist-worn device
As shown in FIG. 2B, the device 226 (which may be similar to or be the sensing device 142 shown in FIG. 1A) may be configured to be worn by the user 124, such as around a wrist, arm, ankle, or the like. Device 226 may monitor the user's athletic movements, including all-day activity of user 124. In this regard, the device 226 can detect athletic movement during interaction of the user 124 with the computer 102 and/or can operate independently of the computer 102. For example, in one embodiment, device 226 may be an all-day activity monitor that measures activity regardless of whether the user is near or interacting with computer 102. Device 226 may communicate directly with network 132 and/or other devices, such as devices 138 and/or 140. In other embodiments, the athletic data obtained from the device 226 may be utilized in determinations made by the computer 102, such as determinations relating to which exercise programs are presented to the user 124. In one embodiment, device 226 may also interact wirelessly with a mobile device, such as device 138 associated with user 124, or with a remote website, such as a site dedicated to fitness or health-related topics. At some predetermined time, the user may wish to transfer data from the device 226 to another location.
As shown in FIG. 2B, the device 226 may include an input mechanism, such as a depressible input button 228, that facilitates operation of the device 226. The input button 228 may be operatively connected to a controller 230 and/or any other electronic components, such as one or more elements discussed with respect to the computer 102 shown in FIG. 1B. The controller 230 may be embedded in or otherwise made part of the housing 232. Housing 232 may be formed from one or more materials, including elastomeric components, and include one or more displays, such as display 234. The display may be considered an illuminable portion of the device 226. The display 234 may include a series of individual lighting elements or light members, such as LED lights 234 in the exemplary embodiment. The LEDs or the like may be formed in an array and operatively connected to the controller 230. The device 226 may include an indicator system 236, which may also be considered a part or component of the overall display 234. It is understood that the indicator system 236 may operate and illuminate in conjunction with the display 234 (which may have a pixel member 235), or may be completely separate from the display 234. The indicator system 236 may also include a plurality of additional lighting elements or light members 238, which may also take the form of LEDs or the like in the exemplary embodiment. In some embodiments, the indicator system may provide a visual indication of goals, such as by illuminating a portion of the light members 238 to represent accomplishment toward one or more goals.
The fastening mechanism 240 may be unlatched, whereupon the device 226 may be positioned around the wrist of the user 124, and the fastening mechanism 240 may then be placed in a latched position. The user may wear the device 226 at all times, if desired. In one embodiment, the fastening mechanism 240 may include an interface, including but not limited to a USB port, for operative interaction with the computer 102 and/or the devices 138, 140.
In certain embodiments, the device 226 may include a sensor assembly (not shown in FIG. 2B). The sensor assembly may include a plurality of different sensors. In an example embodiment, the sensor assembly may include or permit operative connection to an accelerometer (including multi-axis accelerometers), a heart rate sensor, a location-determining sensor (such as a GPS sensor), and/or other sensors. The motions or parameters detected from one or more sensors of the device 142 may include (or may be used to develop) various parameters, metrics, or physiological characteristics, including but not limited to: speed, distance, number of steps taken, calories, heart rate, sweat detection, effort, oxygen consumed, and/or oxygen kinetics. Such parameters may also be expressed in terms of activity points or currency earned by the user based on the user's activity.
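As a toy illustration of expressing such parameters as activity points, the weighting below is entirely an assumption; the disclosure states only that parameters may be expressed as points or currency earned from the user's activity.

```python
# Hypothetical mapping from sensed metrics to "activity points".
def activity_points(steps: int, calories: float, minutes_active: float) -> int:
    # Assumed weights; a real system would define its own scoring scheme.
    return round(steps * 0.01 + calories * 0.5 + minutes_active * 1.0)

print(activity_points(steps=8000, calories=300.0, minutes_active=45.0))  # 275
```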
Various examples may be implemented using electronic circuitry configured to perform one or more functions. For example, in some embodiments of the invention, a computing device, such as a smartphone, mobile device, computer, server, or other computing equipment, may be implemented using one or more application-specific integrated circuits (ASICs). More typically, however, components of various examples of the invention will be implemented using a programmable computing device executing firmware or software instructions, or by some combination of special-purpose electronic circuitry and firmware or software instructions executing on a programmable computing device.
II. Monitoring system
FIGS. 3A-B illustrate examples of a computer interacting with at least one sensor, according to example embodiments. In the illustrated example, the computer 102 may be implemented as a smartphone that may be carried by the user. Example sensors may be worn on the user's body or located off-body, and may include any of the sensors discussed above, including accelerometers, distributed sensors, heart rate monitors, temperature sensors, and the like. In FIG. 3A, a pod sensor 304 and a distributed sensor 306 (including, for example, the sensor system 202 discussed above with one or more FSRs 206) are shown. The pod sensor 304 may include an accelerometer, a gyroscope, and/or other sensing technologies. In some examples, the pod sensor 304 may include at least one sensor that monitors data not directly related to the user's motion. For example, ambient sensors may be worn by the user or may be external to the user. Ambient sensors may include a temperature sensor, a compass, a barometer, a humidity sensor, or other types of sensors. Other types of sensors and combinations of sensors configured to measure user motion may also be used. Also, the computer 102 may incorporate one or more sensors.
The pod sensor 304, the distributed sensor 306, and other types of sensors may include wireless transceivers to communicate with one another and with the computer 102. For example, the sensors 304 and 306 may communicate directly with the network 132, with other devices worn by the user (e.g., a watch, an armband device, etc.), with sensors or devices worn by a second user, or with external devices. In an example, a sensor in a left shoe may communicate with a sensor in a right shoe. Also, one shoe may include multiple sensors that communicate with each other and/or with a processor of the shoe. Further, a pair of shoes may include a single processor that collects data from a plurality of sensors associated with the shoes, and a transceiver coupled to the single processor may communicate the sensor data to at least one of the computer 102, the network 132, and the server 134. In another example, one or more sensors of a shoe may communicate with a transceiver that communicates with at least one of the computer 102, the network 132, and the server 134. Further, sensors associated with a first user may communicate with sensors associated with a second user. For example, a sensor in a first user's shoe may communicate with a sensor in a second user's shoe. Other suitable topologies may also be used.
The computer 102 may exchange data with the sensors and may also communicate data received from the sensors to the server 134 and/or another computer 102 via the network 132. A user may wear headphones or earpieces to receive audio information from the computer 102, directly from one or more sensors, from the server 134, from the network 132, from other locations, and combinations thereof. The headset may be wired or wireless. For example, distributed sensor 306 may communicate data to a headset for audible output to a user.
In an example, a user may wear shoes each equipped with an accelerometer, a force sensor, or the like, to allow the computer 102 and/or the server 134 to determine, alone or in combination with the systems described above with reference to FIGS. 1A-B and 2A-B, indicators of individual motion and of each foot or other body part (e.g., leg, hand, arm, individual fingers or toes, regions of a person's foot or leg, hips, chest, shoulders, head, eyes).
The processing of the data may be distributed in any manner, or performed entirely at one shoe, at the computer 102, at the server 134, or a combination thereof. In the description that follows, the computer 102 may be described as performing a function. Other devices, including the server 134, a controller, another computer, a processor in a shoe or other article of clothing, or other devices may perform the function in place of or in addition to the computer 102. For example, one or more sensors in each shoe (or other peripheral sensors) may be paired with a respective local controller that performs some or all of the processing of the raw signals output by the one or more sensors. The controller's processing, at any given time, may be subject to command and control by a higher-level computing device (e.g., computer 102). The higher-level device may receive and further process the processed sensor signals from one or more controllers, e.g., via one or more transceivers. Comparisons and calculations may be performed at one or more computing devices, including some or all of the computing devices described above, with or without additional computing devices. The sensors may sense desired conditions and generate raw signals that are processed in order to provide processed data. The processed data may then be used to determine current performance metrics (e.g., current speed of travel, etc.), and the determinations made may vary according to user input (e.g., how high did I jump?).
In an example, the sensors 304 and 306 can process and store measurement data and forward the processed data (e.g., average acceleration, top speed, total distance, etc.) to the computer 102 and/or the server 134. The sensors 304 and 306 may also send raw data to the computer 102 and/or the server 134 for processing. The raw data, for example, may include time-varying acceleration signals measured by an accelerometer, time-varying pressure signals measured by a pressure sensor, and the like. Examples of multi-sensor apparel and the use of multiple sensors in athletic activity monitoring are described in U.S. application No. 12/483,824, entitled "FOOTWEAR HAVING SENSOR SYSTEM," published as U.S. publication No. 2010/0063778 A1, and U.S. application No. 12/483,828, entitled "FOOTWEAR HAVING SENSOR SYSTEM," published as U.S. publication No. 2010/0063779 A1. The contents of the above-referenced applications are incorporated herein by reference in their entirety. In a particular example, an athlete may wear shoes 302 having one or more force-sensing systems, for example, utilizing force-sensitive resistor (FSR) sensors, as shown in FIG. 2A and described in the above-mentioned patent publications. The shoe 302 may have multiple FSR sensors 206 that detect forces at different regions of the user's foot (e.g., heel, midsole, toes, etc.). The computer 102 may process data from the FSR sensors 206 to determine the balance of the user's foot and/or between the user's feet. For example, the computer 102 may compare force measurements by the FSRs 206 of the left shoe against force measurements by the FSRs 206 of the right shoe to determine balance and/or weight distribution.
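The left/right balance comparison described above might look like the following sketch, which sums each shoe's FSR readings and reports each side's share of the total load; the reading layout is an assumption.

```python
# Balance/weight distribution from per-shoe FSR readings (layout assumed).
def balance(left_fsr_readings, right_fsr_readings):
    left, right = sum(left_fsr_readings), sum(right_fsr_readings)
    total = left + right
    if total == 0:
        return (0.5, 0.5)  # no load detected on either foot
    return (left / total, right / total)

l, r = balance([120, 80, 60], [100, 90, 50])  # e.g., heel, midsole, forefoot
print(f"left {l:.0%} / right {r:.0%}")        # left 52% / right 48%
```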
FIG. 3B is another example data flow diagram in which the computer 102 interacts with at least one sensor processing system 308 to detect user actions. The sensor processing system 308 may be physically separate and distinct from the computer 102 and may communicate with the computer 102 through wired or wireless communication. The sensor processing system 308 may include the sensor 304, as shown, as well as other sensors (e.g., sensor 306) in place of or in addition to the sensor 304. In the illustrated example, the sensor system 308 may receive and process data from the sensor 304 and the FSR sensors 206. The computer 102 may receive input from the user regarding the type of activity session (e.g., cross-training, basketball, running, etc.) the user wishes to perform. Alternatively or additionally, the computer 102 may detect the type of activity the user is performing or receive information from another source about the type of activity being performed.
Based on the activity type, the computer 102 may identify one or more predefined action templates and communicate a subscription to the sensor system 308. Action templates may be used to identify motions or actions that the user may perform when engaging in the determined type of activity. For example, an action may correspond to a set of one or more events, such as detecting that a user has taken a step to the right followed by a step to the left, or detecting that a user has jumped while flicking his or her wrist. Accordingly, different sets of one or more action templates may be defined for different types of activities. For example, a first set of action templates defined for basketball may include dribbling, shooting, boxing out, dunking, sprinting, and the like. A second set of action templates defined for soccer may include kicking a ball to take a shot, dribbling, stealing, heading, and the like. Action templates may correspond to any desired level of granularity. In some examples, a particular type of activity may include 50-60 templates. In other examples, a type of activity may correspond to 20-30 templates. Any number of templates may be defined as desired for a type of activity. In still other examples, the templates may be selected manually by the user rather than by the system.
The sensor subscriptions may allow the sensor system 308 to select the sensors from which data is to be received. The sensor processing system 308 can manage the subscriptions in use at any particular time. Types of subscriptions may include: force-sensitive resistance data from one or more force-sensitive resistors, acceleration data from one or more accelerometers, summation information over multiple sensors (e.g., a sum of acceleration data for one or more sensors, a sum of force-resistance data, etc.), pressure maps, mean center-of-pressure data, gravity-adjusted sensor data, force-sensitive resistance derivatives, acceleration derivatives, and the like, and/or combinations thereof. In some examples, a single subscription may correspond to a summation of data from multiple sensors. For example, if a template requires a force transfer to the forefoot region of the user's foot, a single subscription may correspond to the sum of the forces of all sensors in the forefoot region. Alternatively or additionally, the force data for each of the forefoot force sensors may correspond to a distinct subscription.
For example, if the sensor system 308 includes four force-sensitive resistive sensors and an accelerometer, a subscription may specify which of those five sensors are monitored for sensor data. In another example, a subscription may specify receiving/detecting sensor data from a right shoe accelerometer but not a left shoe accelerometer. In yet another example, a subscription may include monitoring data from a wrist-worn sensor but not a heart rate sensor. Subscriptions may also specify sensor thresholds to adjust the sensitivity of the sensor system's event detection process. Thus, in some activities, the sensor system 308 may be instructed to detect all force peaks above a first specified threshold. For other activities, the sensor system 308 may be instructed to detect all force peaks above a second specified threshold. Use of different sensor subscriptions may help the sensor system conserve power if certain sensor readings are not needed for a particular activity. Accordingly, different activities and activity types may use different sensor subscriptions.
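A subscription might be represented as a small declarative structure like the sketch below; the field names and threshold semantics are illustrative assumptions, since the disclosure describes the concept rather than a concrete format.

```python
# Hypothetical sensor subscription: which channels to report, and at what
# force-peak threshold, so unneeded channels stay quiet to save power.
subscription = {
    "channels": [
        "right_accelerometer",
        "forefoot_force_sum",       # one subscription summing several FSRs
    ],
    "force_peak_threshold_n": 200,  # report only force peaks above this
}

def should_report(channel: str, value: float, sub: dict) -> bool:
    """Subscription filtering as it might run on the sensor side."""
    if channel not in sub["channels"]:
        return False
    if channel.endswith("force_sum"):
        return value >= sub["force_peak_threshold_n"]
    return True

print(should_report("forefoot_force_sum", 350.0, subscription))  # True
print(should_report("left_accelerometer", 1.2, subscription))    # False
```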
The sensor processing system 308 may be configured to perform initial processing of the raw sensor data to detect various discrete events. Examples of events may include a foot strike or launch when jumping, a maximum acceleration during a certain period of time, and so forth. The sensor system 308 may then pass the events to the computer 102 for comparison with the various templates to determine whether an activity has been performed. For example, the sensor system 308 may identify one or more events and communicate Bluetooth® Low Energy (BLE) data packets, or other types of data, to the computer 102. In another example, the sensor system 308 may instead or additionally transmit raw sensor data.
After receiving the events and/or raw sensor data, the computer 102 may perform post-match processing, including determining various activity metrics, such as number of repetitions, airtime, speed, distance, and so forth. Activity classification may be performed by identifying the various events and actions represented within the data received from any number and type of sensors. Accordingly, activity tracking and monitoring may include determining whether one or more expected or known actions within an activity type have been performed, as well as the metrics associated with those actions. In one example, an action may correspond to a series of one or more low-level or discrete events, and may be detected using a predefined action template.
For example, using an action template, the computer 102 may automatically detect when the user has performed a particular activity or performed a particular motion expected during that activity. If the user is playing basketball, for example, detecting that the user has jumped while flicking his or her wrist may indicate that the user has taken a shot. In another example, detecting that the user has moved both feet outward while jumping, followed by moving both feet inward while jumping, may register one repetition of a jumping jack exercise for the user. Various other templates may be defined as desired to identify particular types of activities, or actions or motions within those types of activities.
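A template match over a stream of discrete events might be sketched as follows; the event names, in-order matching rule, and time window are assumptions for illustration.

```python
# Match a stream of (timestamp, event) pairs against an ordered action
# template, e.g. take-off followed by a wrist flick registering as a shot.
def matches_template(events, template, window_s=0.5):
    """Return True if the template's events occur in order within window_s."""
    times, idx = [], 0
    for t, name in events:
        if idx < len(template) and name == template[idx]:
            times.append(t)
            idx += 1
    return idx == len(template) and bool(times) and times[-1] - times[0] <= window_s

SHOT_TEMPLATE = ["takeoff", "wrist_flick"]

stream = [(10.00, "takeoff"), (10.18, "wrist_flick"), (10.40, "landing")]
print(matches_template(stream, SHOT_TEMPLATE))  # True -> count one shot
```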
FIG. 4 illustrates an example of a round pod sensor 304 that may be embedded in and removed from a shoe, according to an example embodiment. The pod sensor 304 may include a rechargeable battery that can be recharged when inserted into a wall adapter. Wired or wireless charging of the pod sensor 304 may be used. For example, the pod sensor 304 may be inductively charged. In some examples, the pod sensor 304 may be configured with an interface (e.g., universal serial bus) permitting insertion into a computer or other device for downloading and/or receiving data. The pod sensor's interface may provide wired or wireless communication. For example, software updates may be loaded onto the pod sensor when it is connected to a computer. The pod sensor may also receive software updates wirelessly. When physically coupled to the computer 102 (or another device having a port), the pod sensor may recharge and communicate with the computer 102.
FIG. 5 illustrates an example on-body configuration of the computer 102, according to an example embodiment. The computer 102 may be configured to be worn at a desired location on the user's body, such as the user's arm, leg, or chest, or may otherwise be integrated into a garment. For example, each article of apparel may have its own integrated computer; the computer may be a thin client, driven by the context of the user's ongoing activity and otherwise equipped/networked environment. The computer 102 may also be located remotely from the user's body, as shown in FIGS. 6-7.
FIGS. 6-7 illustrate various example off-body configurations of the computer 102, according to example embodiments. The computer 102 may be placed in a docking station 602 to permit the GUI to be displayed on a larger screen and audio to be output by a stereo system. As in other examples, the computer 102 may receive user commands via direct user input (e.g., using a keyboard), via a remote control, or in response to voice commands. Other off-body configurations may include placing the computer 102 on the floor or on a table near where the user is exercising, storing the computer 102 in a gym bag or other storage container, placing the computer 102 on a tripod 702, and placing the computer 102 on a wall mount 704. Other off-body configurations may also be used. When the computer is off-body, the user may wear headphones, earbuds, a wrist-worn device, etc., which may provide real-time updates to the user. The pod sensors 304 and/or distributed sensors 306 may communicate wirelessly with the off-body computer 102 at periodic intervals, when in range, or when triggered by the user, and/or may store data and upload it to the computer 102 when in range or at a later time as instructed by the user.
In an example, a user may interact with a graphical user interface (GUI) of the computer 102. FIG. 8 illustrates an example display of a GUI presented by a display screen of the computer 102, according to an example embodiment. A home page display 802 of the GUI may present a home page providing general information, prompting the user to select the type of physical activity session the user is interested in performing, and permitting the user to retrieve information about previously completed sessions (e.g., basketball games, practice sessions, etc.). The display screen of the computer 102 may be touch sensitive and/or may receive user input through a keyboard or other input device. For example, the user may tap the display screen or provide other input to cause the computer 102 to perform operations.
To obtain information about a previous session, the user may tap or otherwise select field 804, which lists the last session, causing the computer 102 to update the home page display 802 to show performance metrics (e.g., vertical leap, total air time, activity points, etc.) from at least one previous session. For example, as seen in FIG. 8, the selected field 804 may expand to display information about the duration of the last session, the user's highest vertical leap, the total amount of time the user was airborne during the last session, and the number of incentive points (e.g., activity points) earned in the previous session. The computer 102 may determine performance metrics (e.g., speed, vertical leap, etc.) by processing data sensed by the sensors 304 and 306 or other sensing devices.
The home page display 802 may prompt the user to select whether they wish the computer 102 to track one or more user performance metrics during a practice or athletic activity session (e.g., "track my game") by selecting field 806, or to assist the user in improving their athletic skills (e.g., "improve my game") by selecting field 808. FIGS. 9-21 relate to the former, and FIGS. 22-31 relate to the latter.
FIG. 9 illustrates example performance metrics selectable by a user, according to an example embodiment. In an example, the user may be interested in monitoring their total play time, vertical leap, distance, calories burned, and/or other metrics, and may use the home page display 802 to select from the desired metrics shown in FIG. 9. The metrics may also vary based on the type of athletic activity performed in a session. For example, the home page display 802 may present certain default performance metric options depending on the session's activity, and the user may provide input to change the default options.
In addition to those shown in FIG. 9, other performance metrics may include: total number of jumps; number of vertical leaps above a certain height (e.g., above 3 inches); number of sprints (e.g., above a certain rate, either selected by the user or specified by the computer 102); number of fakes (e.g., rapid changes of direction); jump recovery (e.g., fastest time between jumps); work rate (e.g., which may be a function of average power multiplied by the length of the session); work rate level (e.g., low, medium, high); total number of steps; number of steps per unit of time (e.g., per minute); number of bursts (e.g., the number of times the user exceeds a speed threshold); balance; weight distribution (e.g., comparing the weight measured by FSRs 206 in the user's left shoe to the weight measured by FSRs 206 in the user's right shoe, as well as the weight measured by each FSR within a shoe); average session duration; total session time; average number of repetitions per session; average number of exercises performed per session; total number of points earned; total calories burned; or other performance metrics. Additional performance metrics may also be used.
In an example, the computer 102 may prompt the user to indicate which metrics to monitor for each type of session (e.g., baseball, soccer, basketball, etc.) and store the identified metrics in a user profile. The computer 102 may also prompt the user for desired metrics at the beginning of each session. Further, the computer 102 may track all performance metrics but display only selected metrics to the user in the GUI. For example, the computer 102 may monitor only certain base metrics (e.g., to extend battery life, to improve responsiveness, to avoid data overload, etc.). If the user wishes to review metrics other than those currently displayed by the GUI, the user may input the desired metrics and the computer 102 may update the GUI accordingly. The displayed metrics may be changed at any time. The default metrics may be presented when a session resumes or when another session begins.
If the computer 102 monitors more metrics than it can display, the computer 102 may later drop to a lower level of monitoring (e.g., as resources are consumed, with alerts given to the user), falling back to the base metrics and eventually to monitoring only one metric or none at all. In an example, the computer 102 may display only the user's base metrics unless/until the user configures otherwise. Depending on available resources, the computer 102 may present only the base performance metrics, or fewer. The sensors may continue to monitor the other performance metrics, and the data from those sensors may be made available at a later time (e.g., via a network experience, etc.).
At the beginning of a session, the computer 102 may calibrate the sensors of the shoes. FIGS. 10-11 illustrate examples of calibrating sensors according to example embodiments. Calibration may include the computer 102 confirming direct or indirect communication with the sensors (e.g., sensors 304 and 306), confirming that the sensors are functioning properly and have sufficient battery life, and establishing baseline data. For example, the computer 102 may communicate with (e.g., send wireless signals to) the pod sensor 304 and the distributed sensors 306 contained in the user's shoes, and the pod sensor and distributed sensors may reply with the requested data. Calibration may also occur at any other time (e.g., partway through a session, at the end of a session, etc.).
During the calibration process, the GUI may prompt the user to stand still to acquire baseline data measurements (e.g., acceleration, weight distribution, total weight, etc.) with the pod sensor 304 and the distributed sensors 306, as seen in displays 1002A-B. Calibration may also prompt the user to lift each foot individually to permit the computer 102 to determine which foot is associated with which sensor data. The distributed sensors 306 may also be encoded with footwear information that the computer 102 obtains during the calibration process, such as shoe type, color, size, which foot (e.g., left or right), etc. The computer 102 (or server 134) may process the replies from the sensors 304 and 306 and update the GUI to inform the user of any problems, how to resolve them (e.g., replace the battery, etc.), or whether calibration was successful, as seen in display 1002C. In FIG. 11, for example, field 1104 shown on the left side of display 1102A includes an example display of battery life and connection status (e.g., connected, not connected). Calibration may also occur upon certain events, such as detecting removal of the pod sensor 304. Upon calibration, display 1102B presents the user's weight distribution along with a scale 1106 representing remaining battery life.

As part of calibrating one or more sensors, and/or as a separate feature or function, the GUI may be configured to display performance data in substantially real time (e.g., data may be permitted to be transmitted for display as soon as it is acquired and/or processed). FIG. 11B illustrates an example GUI that may be implemented in accordance with one embodiment. As seen in FIG. 11B, display 1102C may provide one or more selectable activity parameters and display captured values associated with the selected parameters. For example, a user who wishes to view information about their vertical height during a jump may select the "vertical" icon (see element 1108); other icons may include, but are not limited to: quickness (which may display values of steps per second and/or distance per second), pressure, and/or any other detectable parameter. In other embodiments, multiple parameters may be selected for simultaneous display. In still other embodiments, no parameter selection may be required; default parameters may be displayed without user input.

Data related to the selected parameter(s) may be provided in real time in display 1102C. For example, output 1110 indicates that the user has jumped "24.6 inches". The value may also be provided graphically, such as by graphic 1112, which indicates a value of 24.6 inches. In some embodiments, the output of values, such as through outputs 1110 and/or 1112, may be real-time data, while in other embodiments at least one of outputs 1110/1112 may display other values, such as historical values, desired target values, and/or maximum or minimum values. For example, graphic 1112 may fluctuate with the user's current (e.g., real-time) jump height, while output 1110 may display the user's highest recorded jump during the session or of all time. The output of a value or result may also be correlated to a physical object and/or action.
For example, when a user jumps a vertical height within a first range, such as between 24 and 30 inches, they may receive an indication relating that jump to a physical object, such as an object they could jump over (see, e.g., display 1102D of FIG. 11B). As another example, a value for the user's steps per second may be correlated to an actual animal and displayed alongside it. Those skilled in the art will recognize that other physical objects may be utilized in accordance with different embodiments.
The computer 102 may prompt the user to start a session. FIG. 12 illustrates an example display of a GUI presenting information about a session, according to an example embodiment. Display 1202A may initially prompt the user to check in to a venue and start the session. The user may also enter the type of session (e.g., practice, pick-up game, tournament, half court, full court, 3-on-3, 5-on-5, etc.). Display 1202B may inform the user of the session's duration and prompt the user to pause and/or end the session. Display 1202C may present the user's current performance metrics (e.g., vertical leap height, air time, tempo, etc.). For viewing purposes, display 1202 may present default or user-selected statistics, and a swipe or other gesture may scroll or sort through groups of a predetermined number of performance metrics (e.g., three, or however many fit on screen in portrait or landscape orientation) or otherwise bring up other performance metrics.
The computer 102 may also update display 1202 when a particular event is identified. For example, if a new record (e.g., a personal best, such as a new highest vertical leap) is identified, the computer 102 may perform at least one of: updating the display (e.g., the presented information, colors, etc.), vibrating, or emitting a sound indicating the particular record (e.g., based on a color-change layout on the shoe corresponding to a particular metric), or otherwise prompting the user that a record (e.g., for any metric) has been reached. Display 1202 may also present a button for user selection signifying that a record has been achieved. Display 1202B may prompt the user to review their performance metrics (e.g., "check my stats"), as described with respect to FIG. 13.
FIG. 13 illustrates an example display of a GUI providing information about a user's performance metrics during a session, according to an example embodiment. Display 1302 may present information about the length of the current or a previous session in field 1304, various performance metrics of the user (e.g., highest vertical, total air time, tempo, etc.) in field 1308, and who the user played with during the session in field 1310. For example, the computer 102, sensor 304 or 306, or other device associated with a first user may exchange a first user identifier with the computer 102, sensor 304 or 306, or other device associated with a second user, so that each computer is aware of who participated in the session.
The computer 102 may also process the performance metrics to assign a playing style to the user, as indicated in field 1306. In response to determining that the user hustled for thirty minutes, for example, field 1306 may indicate that the user's playing style is "hot score"; the box to the right of field 1306 may indicate an alternate playing style. The computer 102 may identify other types of playing styles. For example, the computer 102 may assign a "silent thrill" style when identifying inactive phases followed by sudden bursts, a "whirlpool" playing style when the user exhibits little motion or jumping during the session, a "cobra" playing style when the user exhibits constant easy motion with large bursts and jumps, a "track star" playing style when the user is fast, has good endurance, and has a high peak speed, and a "skywalker" playing style when the user has a large vertical leap and long hang time. In some examples, more than one style may be assigned to a user, and the style associated with one session may differ from the style associated with another session. Multiple styles may also be assigned and displayed for a single session.
The computer 102 may assign a particular playing style based on data received from at least one of the pod sensor 304 (e.g., acceleration data), the distributed sensors 306 (e.g., force data), or other sensors. The computer 102 may compare the user's data to data characterizing a plurality of different playing styles to determine which style most closely matches. For example, the computer 102 may set performance metric thresholds for each playing style. Some playing styles may require that, at least once in a session, the user jumped a certain height, ran at a certain speed, played for a certain amount of time, and/or performed other tasks. Other playing styles may require that the user data indicate that the user performed a particular series of events (e.g., little motion followed by rapid acceleration to at least a certain top speed). Still other playing styles may require that the user data indicate that a threshold was maintained for a certain amount of time (e.g., average speed maintained above a threshold throughout the session).
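One plausible reading of this threshold-based matching is a per-style table of minimum metric values, as sketched below. The style names echo those above, but the specific thresholds and the default fallback are invented for illustration.

```python
# Hypothetical per-style minimum thresholds; a session's metrics are
# matched against each profile and every qualifying style is returned.
PLAY_STYLES = {
    "skywalker":  {"max_vertical_in": 30, "total_hang_time_s": 20},
    "track star": {"peak_speed_mph": 14, "distance_mi": 2.5},
}

def assign_play_styles(session_metrics):
    """Return all playing styles whose thresholds the session satisfies."""
    matches = [
        style for style, mins in PLAY_STYLES.items()
        if all(session_metrics.get(k, 0) >= v for k, v in mins.items())
    ]
    return matches or ["baller"]  # hypothetical default style

print(assign_play_styles({"max_vertical_in": 32, "total_hang_time_s": 25}))
# -> ['skywalker']
```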
In an example, a playing style may be assigned based on a data set derived from a group of sensors, including sensors worn at various locations on the user's body (e.g., accelerometers placed on the gluteal muscles and/or upper body to identify a "firecracker" playing style). Non-activity data may also be used to determine a playing style, such as user profile data (e.g., the user's age, height, gender, etc.). For example, some playing styles may be gender-specific or based on environmental conditions (e.g., a "duel" style because the user is playing in rain, hail, snow, etc.).
A user or group of users may define their own playing styles based on combinations of metrics and analyses, and may change a style's name without changing the associated metrics and analyses. Playing styles may also be updated automatically. For example, the personal training system 100 may periodically update the playing styles it defines. In another example, the system 100 may automatically update a playing style when its name is associated with a particular location (e.g., a state, city, or court) but is referred to by a different name at another location (e.g., to remain consistent with local terminology).
In FIG. 13, display 1302 permits the user to share their performance metrics with other users and/or post them to a social networking site by selecting field 1312. The user may also enter a message (e.g., "check out my vertical leap") to accompany the shared data. In response to a sharing request, the computer 102 may distribute the performance metrics and message for the current and/or previous sessions to the server 134. The server 134 may incorporate the data and/or message into the social networking site and/or may distribute the data/message to other desired users, or to all users.
FIG. 14 illustrates an example display of a GUI presenting information about a user's virtual business card (vcard), according to an example embodiment. The vcard may include information about the user's athletic history, such as the user's performance metrics, sessions, and awards, both for individual sessions and as averages of the performance metrics. The vcard statistics display 1402A may indicate the number of points (e.g., activity points or metrics) the user has earned, as well as the user's totals and/or bests. Activity points may be a statistic indicative of physical activity performed by the user, and the server 134 and/or computer 102 may award activity points to the user upon completion of certain athletic milestones. The vcard sessions display 1402B may indicate the total amount of play time and the number of sessions the user has completed, and provide historical information about completed sessions, including the playing style the user exhibited in each session as well as each session's length and date. The vcard awards display 1402C may indicate awards the user has accumulated over time. For example, the server 134 and/or computer 102 may award the user a flight club award after a certain total amount of hang time has been accumulated over the course of sessions.
Other example awards may include: a "king of the court" award for having one or more of the highest metrics at a particular court; a "flier miles" award for accumulating a mile of hang time (or another measure combining time and distance); a "world wide weiss" award when a player participates in activity sessions in multiple countries; an "ankle destroyer" award for users with at least a certain top speed or fastest first step; a "jumping king" award for users with at least a certain vertical leap; a "24/7 hitter" award for users who play sessions a certain number of days in a row or at a certain number of different courts; an "icer" award if a certain number of opponents follow the user; a "blackman" award to young players who achieve certain performance metric levels when an even greater number of opponents (as compared to the "icer" award) follow the user; and an "old" award to older players who achieve certain performance metric levels. Other types of awards may also be awarded.
FIG. 15 illustrates an example user profile display of a GUI presenting a user profile, according to an example embodiment. The user profile display 1502 may present information about the user, such as height, weight, location, playing style (e.g., "silent thrill"), and other information, and may also indicate one or more types of footwear worn by the user. The user profile display 1502 may present information about the user's activities and may permit the user to control sharing of that information with other users. For example, the user may specify which other users may view the user profile information, or may make all of the information accessible to any other user. FIG. 16 illustrates further examples of information about the user that may be presented in user profile display 1502, according to an example embodiment.
FIGS. 17-20 illustrate further example displays of a GUI for displaying performance metrics to a user, according to example embodiments. During an activity session, at the end of a session, or both, the computer 102 may communicate with at least one of the pod sensor 304, the distributed sensors 306, or other sensors to obtain the data from which performance metrics are generated. Example displays of the GUI while data is being collected are shown in FIG. 17, such as the highest vertical in display 1702A, total air time in display 1702B, tempo statistics in display 1702C, and points in display 1702D. Scroll bar 1704 represents the progress of transferring data from the sensors to the computer 102.
FIG. 18 illustrates an example jump display relating to a user's vertical leap, according to an example embodiment. The computer 102 may track the user's vertical leaps during an exercise session, along with when each jump occurred during the session. The computer 102 may determine the vertical leap height based on the amount of hang time between when the user's feet leave the ground and when one of the user's feet next contacts the ground. The computer 102 may process accelerometer data from the pod sensor 304 and/or force data from the distributed sensors 306 to determine when the user's feet leave the ground and when a foot next contacts it. The computer 102 may also compare the user data from the pod sensor 304 and the distributed sensors 306 against jump data to confirm that the user actually jumped and landed, rather than merely lifting their feet off the ground or hanging on a rim (or other object) for a period of time. The jump data may be data generated to indicate what the force profile and/or acceleration profile of an actual jump looks like. The computer 102 may use a similarity metric when comparing the user data to the jump data. If the user data is not sufficiently similar to the jump data, the computer 102 may determine that the user data is not jump data and may exclude it when determining the user's performance metrics (e.g., highest or average vertical leap).
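Hang time maps to jump height through straightforward projectile motion (the body rises for half the hang time, so h = g*t^2/8), and the similarity gate could be as simple as a correlation against a canonical jump profile. The sketch below assumes a cosine-similarity gate with a 0.8 threshold, neither of which is specified by the text.

```python
import math

G_IN_PER_S2 = 386.1  # standard gravity, inches per second squared

def vertical_leap_inches(hang_time_s: float) -> float:
    """Projectile-motion estimate: the body rises for half the hang
    time, so height h = g * (t/2)**2 / 2 = g * t**2 / 8."""
    return G_IN_PER_S2 * hang_time_s ** 2 / 8.0

def cosine_similarity(a, b):
    """Crude similarity gate between a measured force/acceleration
    profile and a canonical jump profile (equal-length lists)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

measured = [0.1, 0.9, 1.0, 0.2, 0.0]       # hypothetical normalized profile
jump_template = [0.0, 0.8, 1.0, 0.3, 0.0]  # hypothetical canonical jump
if cosine_similarity(measured, jump_template) >= 0.8:  # threshold assumed
    print(round(vertical_leap_inches(0.71), 1))        # -> 24.3 inches
```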
If the computer 102 determines that the user data is jump data, the computer 102 may process the user data to determine vertical leap height, the time of the vertical leap, the user's average vertical leap height, a running total of jump hang time, and/or which foot is the leading foot, among other metrics. The computer 102 may identify the leading foot based on force data and/or accelerometer data associated with each shoe. The force and/or accelerometer data may include timing information so that the computer 102 can compare events in each shoe. The computer 102 may process the force and/or accelerometer data together with the timing information to determine which foot was last on the ground before a jump, and may identify the leading foot as the foot last on the ground when the user jumps and/or the foot associated with the user's highest vertical leap.

The computer 102 may also present a jump display 1802A that includes the user's top five vertical leaps and depicts which foot, or whether both feet, was last on the ground immediately before each jump. The jump display may show any desired number of top jumps, which may be specified by the user or set by the system 100. The number of top jumps may also be based on an amount of time. For example, jump display 1802A may present the top five jumps over the total time of the session, the top five jumps within the last predetermined number of minutes or percentage of the total session time, or a number based on the type of session (e.g., a pick-up game as compared to an organized game). Jump displays 1802A or 1802B may also display vertical leaps over a duration rather than for a single session, where the duration may include, for example, a month, a week, all time, or another range of time; they may also present the total number of jumps, the cumulative amount of hang time, the average hang time, the hang time corresponding to the highest vertical leap, or other information about the jumps. The orientation of the computer 102 may control which of jump display 1802A and jump display 1802B is presented. For example, the user may rotate the computer 102 (e.g., by 90°) to change from the portrait-oriented jump display 1802A to the landscape-oriented jump display 1802B, and may rotate the computer 102 in the opposite direction to return from jump display 1802B to jump display 1802A. Similarly, rotation of the computer 102 may be used to alternate between displays in other examples described herein.
In another example, jump display 1802B may display the user's jumps chronologically over the session, indicating the time at which each jump occurred and its vertical height. During the session, jump display 1802B may also display a personal best vertical leap from a previous session, or one preset by the user. In an example, the personal best line may change during a session, either via a step function or by adding a new line (e.g., in a "new best" color) that supplements the existing line and shows where in the session the new best occurred. The computer 102 may also update jump display 1802B by replacing the previous personal best line (e.g., in one color) with a new line (e.g., in a personal best color used only when a personal best occurs during the session). Further, the line's color may change as the user's personal best improves, to indicate a comparison of abilities with other users (e.g., your jump is higher than 85% of other users).
Jump display 1802B may include a performance zone (e.g., a dunk zone) that indicates when the user is able to perform an action (e.g., dunk a basketball). The computer 102 may adjust the user's performance zone based on the user's physical attributes (e.g., height, arm length, leg length, torso length, body length, etc.). For example, the dunk zone may require a higher vertical leap for a shorter user than for a taller user.
A performance zone may correspond to a range of values, a minimum value, or a maximum value, where the one or more values are associated with when a particular action may be expected from the user's athletic performance. For example, the performance zone may be the minimum vertical leap that would permit the user to dunk a basketball. The user need not actually perform the action (e.g., a dunk); rather, the performance zone indicates when the computer 102 calculates that the user could perform it.
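A minimal version of such a zone computation might derive the required vertical leap from the gap between the user's standing reach and the rim, as in the following sketch. The rim height is the standard 120 inches, but the clearance margin and the use of standing reach as the governing physical attribute are assumptions.

```python
def dunk_zone_min_leap(standing_reach_in, rim_height_in=120.0, clearance_in=6.0):
    """Hypothetical dunk-zone rule: the minimum vertical leap is the gap
    between the user's standing reach and the rim, plus some clearance."""
    return max(0.0, rim_height_in - standing_reach_in + clearance_in)

def in_dunk_zone(best_vertical_in, standing_reach_in):
    """True when the user's best vertical leap meets the zone minimum."""
    return best_vertical_in >= dunk_zone_min_leap(standing_reach_in)

# A shorter player (reach 90") needs a 36" leap; a taller one (reach 102")
# needs only 24", matching the taller/shorter comparison in the text.
print(dunk_zone_min_leap(90.0), dunk_zone_min_leap(102.0))  # -> 36.0 24.0
print(in_dunk_zone(30.0, standing_reach_in=102.0))          # -> True
```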
Based on sensor data obtained from one or more sessions, the computer 102 may provide recommendations to help the user reach the performance zone. For example, analysis by the computer 102 of sensor data associated with the user's jumps may enable feedback that helps the user improve their ability to reach the dunk zone or to improve their personal best hang time. For instance, the computer 102 may process the sensor data and recommend that the user adjust certain body parts to increase jump height. In another example, the computer 102 may suggest that the user obtain greater acceleration off the leading foot, or apply greater pressure with the trailing foot, by increasing upper body acceleration.
A performance zone may be established for any desired athletic activity. Example performance zones may correspond to a minimum amount of pressure, a maximum amount of pressure, or pressure within a particular range, as measured by the distributed sensors 306. Other example performance zones may correspond to a minimum amount of acceleration, a maximum amount of acceleration, or acceleration within a particular range. A performance zone may also be based on a combination of different measurements, or on a series of measurements. For example, a performance zone may specify at least a certain amount of acceleration, followed by at least a certain amount of hang time, followed by at least a certain amount of measured pressure.
In gymnastics, for example, acceleration and body rotation may be monitored. During a dismount from the uneven bars, a gymnast may want a certain amount of body rotation: rotating too fast or too slow may leave the body incorrectly positioned for landing. The performance zone may be a "rotation band" specifying minimum and maximum rotational accelerations, and during the dismount the computer 102 may monitor for rotation that is too fast or too slow and provide feedback on whether the gymnast is within the performance zone. When practicing the dismount, the computer 102 may provide recommendations to adjust certain body parts to adjust the amount of acceleration, thereby increasing or decreasing the rotation. Performance zones may likewise be established for other sports (e.g., track and field, golf, etc.).
The computer 102 may adjust a performance zone based on feedback received from the user. In an example, the computer 102 may receive input from the user indicating a vertical-leap action the user is able to perform (e.g., dunking a basketball), and the computer 102 may adjust the minimum vertical leap that places the user in the performance zone based on that feedback. The computer 102 may award one or more activity points to the user for reaching the performance zone, and for maintaining performance within it. The computer 102 may also determine the number of calories the user burned while in the performance zone.
The computer 102 may present the rate at which the user earned activity points over the duration of an exercise session. FIG. 18 illustrates an example activity points display 1804, according to an example embodiment. The computer 102 may determine and award activity points to the user during the session by comparing the measured user performance against any number of metrics. For example, the computer 102 may award a predetermined number of activity points for running a predetermined distance. As seen in FIG. 18, one line of the activity points display 1804 may represent the activity points earned by the user at various times during the session, line 1806 may represent the all-time average rate at which the user has accumulated activity points, line 1808 may represent the average rate at which the user accumulated activity points during this particular session, and line 1812 may represent the all-time best rate for accumulating activity points. In an example, line 1806 may represent how many activity points the user accumulates per minute or other time interval (e.g., per millisecond, per second, per ten seconds, per thirty seconds, etc.). The activity points display 1804 may also present indicia, such as lines, indicating other metrics, for example averages including, but not limited to, the average rate of activity points accumulated over a predetermined number of previous sessions (e.g., the previous three sessions). The lines may have different colors, and if a new all-time best is established, the activity points display 1804 may flash or otherwise present an indication signifying the achievement.
The computer 102 may categorize the activities performed by the user, along with the percentage of time the user spent in each category during the session, and present this information in the activity points display 1804. For example, the activity points display 1804 may indicate the percentage of time during the session that the user was idle, moving laterally, walking, running, sprinting, jumping, and so on. Other categories may be presented instead of, or in addition to, those shown in the activity points display 1804. Further, the activity points display 1804 may show cumulative amounts of time for these statistics rather than percentages. The computer 102 may determine the number of activity points earned in each category, as well as the total earned during the session, and present that information via the activity points display 1804. In an example, the computer 102 may determine that the user earned 25 activity points while walking, 75 while running, and 150 while sprinting, for 250 total activity points. Instead of, or in addition to, determining activity points, the computer 102 may also determine a calorie burn rate for each category.
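The category and points bookkeeping could look like the sketch below, which reproduces the 25/75/150-point example from the text; the per-minute point values are back-solved for that example and otherwise hypothetical.

```python
# Hypothetical points awarded per minute spent in each activity category.
POINTS_PER_MIN = {"idle": 0, "walking": 1, "running": 3, "sprinting": 6, "jumping": 4}

def summarize_session(minutes_by_category):
    """Return per-category time percentages, per-category points, and the
    session's total activity points."""
    total_min = sum(minutes_by_category.values()) or 1
    percent = {c: 100.0 * m / total_min for c, m in minutes_by_category.items()}
    points = {c: POINTS_PER_MIN.get(c, 0) * m for c, m in minutes_by_category.items()}
    return percent, points, sum(points.values())

pct, pts, total = summarize_session(
    {"idle": 10, "walking": 25, "running": 25, "sprinting": 25}
)
print(pts, total)
# -> {'idle': 0, 'walking': 25, 'running': 75, 'sprinting': 150} 250
```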
The computer 102 may also display performance metric data based on measurements of the user's hustle and tempo. FIG. 19 illustrates example hustle displays 1902A-B and tempo displays 1904A-B, according to an example embodiment. Hustle display 1902A may present the user's hustle over the course of a session, along with other performance metrics. For example, the computer 102 may track various performance metrics, including an aggregate of jumps, sprints, fakes, and jump recovery (the shortest time between consecutive jumps) during the session, and hustle may be a function of these metrics. Referring to hustle display 1902B, the computer 102 may classify hustle into three categories: low, medium, and high, although more or fewer categories may be defined. Hustle display 1902B may also present a line 1906 indicating the average hustle level during the session.
Referring to tempo display 1904A, the computer 102 may present information about the user's tempo during the session. Tempo may be calculated from the step rate taken by the user in each time interval (e.g., steps per minute), with categories defined by ranges of step rate. For example, walking may be defined as up to 30 steps per minute, jogging as 31-50 steps per minute, running as 51-70 steps per minute, and sprinting as 71 or more steps per minute. Referring to tempo display 1904B, the computer 102 may indicate how much time the user spent in each category during the session. For example, tempo display 1904B may indicate the percentage of time the user spent in each category (e.g., sprinting 12% of the time). The tempo displays 1904 may also indicate the user's fastest number of steps per second (e.g., 4.1 steps/second) or other time interval, the total number of steps, the total number of sprints, etc.
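The tempo bucketing described above is a direct range classification. A small sketch using the step-rate ranges given in the text (the sample readings are invented):

```python
def tempo_category(steps_per_minute):
    """Bucket a step rate into the tempo categories from the text."""
    if steps_per_minute <= 30:
        return "walking"
    if steps_per_minute <= 50:
        return "jogging"
    if steps_per_minute <= 70:
        return "running"
    return "sprinting"

samples = [22, 45, 64, 80, 75]  # one step-rate reading per interval
counts = {}
for s in samples:
    c = tempo_category(s)
    counts[c] = counts.get(c, 0) + 1

# Percentage of intervals spent in each category, as on display 1904B.
print({c: f"{100 * n / len(samples):.0f}%" for c, n in counts.items()})
# -> {'walking': '20%', 'jogging': '20%', 'running': '20%', 'sprinting': '40%'}
```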
The computer 102 may also notify the user of the number of activity points he or she earned during a workout, as well as the total number of activity points accumulated. FIG. 20 illustrates an example activity points display of a GUI notifying the user of points earned during a session, according to an example embodiment. The computer 102 may process data taken during the session to award points to the user; the points may track the user's activity across different workouts and exercise sessions. Points displays 2002A-B may permit the user to view the number of points earned by date range, by exercise session, or by another range.
The computer 102 may also track user-defined movements. FIG. 21 illustrates an example freestyle display of a GUI providing information about freestyle user movement, according to an example embodiment. In freestyle display 2102A, the computer 102 may prompt the user to initiate a movement to be tracked; the user may perform any desired type of movement, denoted hereafter as a "freestyle" movement. In freestyle display 2102B, the computer 102 may display the user's vertical leap, air time, and the foot used for the jump during the freestyle movement. Freestyle display 2102B may display the performance metrics deemed relevant by the system 100, by the user, or by both. For example, the performance metrics may be vertical leap, air time, and foot, as shown in display 2102B, or weight distribution, as shown in display 2102C, or both, toggled by the user. In freestyle display 2102C, the computer 102 may display the weight distribution measured by the distributed sensors 306. The user may also review changes in weight distribution over time to determine how weight distribution affects their ability to move or jump. The user may, for example, slide a finger across the display screen to move between displays 2102A-C.
In addition to monitoring the user's performance during a session, the computer 102 may help the user improve their athletic skills. FIG. 22 illustrates example training displays 2202A-B presenting user-selectable training sessions, according to an example embodiment. A training session may guide the user through a set of movements designed to improve the user's athletic ability. Example training sessions may include a shooting practice session, an around-the-world session, a pro player session, a basics session, a hang time session, a crossover session, a free throw balance session, a signature moves session, a pro battles session, and a HORSE session. These training sessions are further described in FIGS. 23-26. For example, the computer 102 may have a touch screen permitting the user to scroll through and select among the training sessions shown in FIGS. 23-26.
In other embodiments, one or more non-transitory computer-readable media may include computer-executable instructions that, when executed by a processor, permit a user to participate in challenges and/or games with one or more local and/or remote users. In one embodiment, a display device may be configured to present one or more sport movements to the user. The movements may include skateboarding movements that together form a "trick" (such as, for example, one or more combinations of a backside rail slide, a frontside fakie, and/or other movements that may be performed by the user). In one embodiment, the challenge and/or game may require the user to perform at least one trick.

Some embodiments may resemble a HORSE-like game generally known to athletes, particularly in basketball, in which a first user's successful performance (e.g., a made shot) earns a symbol, such as a letter of a word. For example, performing a trick, or a portion of a trick, may earn the user the letter S in SKATE. In a variant, the trick successfully completed by the first user may dictate what a second individual, or group of individuals, must do to earn the same symbol. Some embodiments may require at least the second user to perform the same trick or movement (or to perform within a threshold level for that movement or trick).

In one embodiment, if a first skateboarder (skater) completes a first combination of movements (e.g., movements that together form a trick, such as, for example, a frontside slide), performance characteristics of those movements, or a portion thereof, may be detected or measured (qualitatively and/or quantitatively). Example performance characteristics may include, but are not limited to, speed, acceleration, position, rotational force, and the height of the skateboard and/or of the user or a portion of the user. In some embodiments, successful completion of a trick in a game such as SKATE may be based on one or more values of the performance characteristics. In one embodiment, at least two performance characteristics may each have to satisfy a threshold, while in another embodiment an overall threshold combining multiple individual values may have to be reached regardless of the individual values.

In another embodiment, a user may not be permitted to perform a movement or trick to earn points in a game, challenge, event, etc. unless the user has previously completed its sub-components and/or less complex movements. For example, in one embodiment, such as in a skateboarding game, a first individual may not challenge a second individual for a letter in SKATE using a trick the first individual has not previously performed, which may be determined, for example, by the same sensors (or at least a portion of them) used in the current challenge or game. For example, in certain embodiments, to attempt a first trick, such as a frontside 360 ollie, the user may have to have completed its components (as measured by one or more sensors or processes discussed herein or known in the art), such as an ollie and/or a rolling fakie. Some embodiments may require completion within a set timeframe, such as within the past month, or a threshold number of times (e.g., at least 5 successful performances), or a combination thereof. In one embodiment, the first user may select a performance metric that the second user must meet.
The first user may be required to declare the performance characteristic before performing and/or being credited with the trick. For example, the first user may identify the trick as a frontside 360 ollie and identify at least one performance metric, such as height or hang time. A second user, to successfully complete the trick, may then have to achieve a certain height or hang time (at least that achieved by the first user, or within a range of it), which may be a default or may be settable. Other examples of challenges, and implementations of example challenges, are discussed in more detail throughout this disclosure, including below.

In one embodiment, the game or challenge may be performed by a single user. For example, the systems and methods may prompt the user to perform a certain trick or series of tricks, where each trick, portion of a trick, and/or series of tricks may be assigned a symbol, such as a letter of a word, that is awarded to the user. For example, the user may perform a first trick to earn the letter S in the word SKATE. According to one embodiment, a challenge may include a plurality of individual tricks, and in still other embodiments a challenge may include a plurality of trick series. In various embodiments, the challenge may include multiple tricks, each performed in a sequential manner. In still other embodiments, the tricks may have to be performed relative to each other within predetermined time intervals, such as the user having to transition from a first trick to a second trick. In some embodiments, the particular letter or symbol awarded may be based on the particular trick, such as, but not limited to, those described below or elsewhere in this document.
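One way to sketch this challenge flow (declare a trick with required metrics, judge a follower's attempt, and award letters of SKATE on a miss) is shown below. The tolerance factor, metric names, and letter-awarding rule are illustrative assumptions, not the method specified by the text.

```python
WORD = "SKATE"

def judge_attempt(measured, required, tolerance=0.9):
    """A follower matches the trick if each declared metric reaches at
    least `tolerance` times the value the challenger posted."""
    return all(measured.get(k, 0) >= tolerance * v for k, v in required.items())

def play_round(challenger_metrics, follower_metrics, follower_letters):
    """If the follower fails to match the declared trick, they collect
    the next letter of SKATE; collecting all five ends the game."""
    if not judge_attempt(follower_metrics, challenger_metrics):
        follower_letters += WORD[len(follower_letters)]
    return follower_letters, follower_letters == WORD

# The challenger lands a trick with 18" of height and 0.60 s of hang time.
declared = {"height_in": 18.0, "hang_time_s": 0.60}
letters, done = play_round(declared, {"height_in": 12.0, "hang_time_s": 0.55}, "SKA")
print(letters, done)  # -> SKAT False  (height fell short, letter awarded)
```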
In one embodiment, a challenge may relate to a specific trick and/or a specific set of tricks. For example, a first challenge may prompt the user to perform any "FAKIE"-type trick. In one embodiment, the user may be presented with at least one image of an athlete, whether professional or amateur, performing at least a portion of the challenge. In at least one embodiment, multiple sequential images, such as a video, may be provided. In some embodiments, the image data may also be viewable without accepting the challenge.
In some embodiments, the user may receive challenges based on location. In one of these embodiments, the challenge may relate to a specific trick and/or a specific set of tricks. For example, upon reaching a particular location or venue, the user may be prompted to perform one or more challenges. In one embodiment, the user may be prompted with a challenge specific or unique to the venue at which it is to be performed. For example, when a user arrives at a first venue (e.g., a skatepark), a first challenge may prompt the user to perform a particular "FAKIE"-type trick; when the user arrives at a second venue, a second challenge may prompt the user to perform a particular "GRIND"-type trick. Challenges and/or specific tricks may be associated with a particular venue based on a variety of factors including, but not limited to, the venue's terrain, elevation, positioning, and physical objects (e.g., ledges, rails, steps), etc. Thus, when the user arrives at a particular venue or location, the user may be prompted to take part in one or more challenges (and/or specific tricks) associated with that venue or location. In some embodiments, after one or more challenges are performed at a first venue, the user may be prompted to perform additional challenges at a second venue. Other embodiments may recommend a particular venue, or a location (or a time at a particular location), for performing at least one challenge or trick, as described herein. Tricks may require different physical objects: for example, several tricks may require a horizontal surface, a ledge, or a rail located at a particular height (or range of heights) above the ground surface.
In embodiments that use location data to determine the user's location, GPS data may be used to determine when the user has arrived at, or departed from, a particular venue or location. For example, a mobile phone or other device may periodically analyze GPS data to determine when the user has left a skatepark or other venue. When a challenge is selected, the GUI may be updated to inform the user of opportunities and locations to participate in challenges. For example, a computing device such as the computer 102 may communicate the user's location information (e.g., GPS data) to a server, which may respond by identifying nearby venues or locations where the user may attempt various challenges or tricks.
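A simple way to realize this venue lookup is a geofence test against a registry of venues, e.g., using the haversine distance between the reported GPS fix and each venue's center. The venue name, coordinates, radius, and challenge list below are all hypothetical.

```python
import math

# Hypothetical venue registry: location plus the challenges tied to it.
VENUES = [
    {"name": "Riverside Skatepark", "lat": 41.8902, "lon": -87.6218,
     "radius_m": 150, "challenges": ["FAKIE series", "GRIND ledge run"]},
]

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS fixes."""
    r = 6_371_000  # mean Earth radius, meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def challenges_near(lat, lon):
    """Return the challenge lists of every venue whose geofence contains the fix."""
    return [v["challenges"] for v in VENUES
            if haversine_m(lat, lon, v["lat"], v["lon"]) <= v["radius_m"]]

print(challenges_near(41.8903, -87.6217))  # inside the geofence -> challenges listed
```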
In certain embodiments, the multiple sequential images, which may be referred to herein as trick videos (TTVs), may include a plurality of images acquired from multiple perspectives, such as sequential images taken from different vantage points. For example, in one embodiment, a first perspective may include a first plurality of images taken during a first time frame at a first angle, such as 25-30 degrees from a ground surface (e.g., a cement surface the skater traverses or performs on). Those skilled in the art will appreciate that the ground surface may be generally planar yet include various angles and/or features projecting from it, such as rails, stairs, and pipes. A second perspective may include a second plurality of images taken at a second angle (such as 30-35 degrees from the same ground surface) during a second time frame, which may or may not overlap a portion of the first time frame. In one embodiment, the first and second perspectives may be taken at different angles along the same horizontal or vertical axis. In one embodiment, the first and second time frames overlap entirely, thus permitting the user to view the same trick (or portion of a trick) from multiple angles.
Data relating to physical activity (raw or processed) may be obtained directly or indirectly, and/or may be derived from one or more sensors, including those described herein. According to some embodiments, the physical activity data may be overlaid on images (or a sequence of images, e.g., a video) of the user, such as a skateboarder (who may be the user 124 shown in FIG. 1), captured during performance of the physical activity.
In one embodiment, the user may adjust the video, during playback or via trick-play commands, to adjust the viewing angle. The user may be permitted to provide one or more inputs to adjust the perspective of the video, such as selecting one of multiple views or vantage points. For example, during one or more portions of a video the user may wish to see a top-down view of the athlete's feet, and possibly the skateboard, and during the same or another portion may wish to see a side perspective view, such as to better observe at least a portion of the athlete's rotation during performance of the trick.
In another embodiment, the user may adjust the video, during playback or via trick-play commands, to adjust the frame rate and/or playback speed. The user may be permitted to provide one or more inputs to adjust the frame rate and/or playback speed of the video in a desired manner, such as to produce a "slow motion" effect when viewed on a display. FIG. 43 provides an example UI configured to permit the user to acquire image data at various frame rates; the UI shown in FIG. 43 may include a UI element configured to permit acquisition of images at a first frame rate ("normal speed"). For example, FIG. 43 shows a UI 1000 configured to accept user input, such as selection of a UI input element (shown as soft button 1002, located on the middle right side of UI 1000). UI 1000 may present image data, such as live image data of a user performing a trick; this may occur before the user begins acquiring image data using UI 1000 (e.g., before activating soft button 1002). In some embodiments, the image data shown within the UI may be analyzed, such as to perform or assist auto-focus, measure distance, adjust brightness, and/or other actions. In one embodiment, acquisition of images at the first frame rate may require both user input via the input element (e.g., soft button 1002) and detection or confirmation of a different, separate trigger event. In another embodiment, no user input via a user input element is required; instead, the start of image acquisition at the first frame rate may be based on a trigger event other than direct user input.
A UI, such as UI 1000, may provide an indication (visible, audible, and/or tactile) that image capture has occurred (such as in response to the user activating a UI input element, e.g., soft button 1002). In one embodiment, the same user-selectable UI input element, e.g., soft button 1002, may be configured to provide the indication. For example, soft button 1002 may be configured to blink, flash, or otherwise change its visual appearance when acquisition of data at the first frame rate has been activated.
Another UI element may permit the user to select a different frame rate for acquiring at least a portion of the images. For example, as shown in FIG. 44, UI 1000 may have a "slow motion" element that may be activated or otherwise selected during acquisition of images at a first rate (e.g., the normal-speed frame rate). As one example, user-selectable UI input element 1004 may be a soft button activated by the user touching a corresponding location on a touch screen. Element 1004 may be configured to appear only when element 1002 is active and/or when images are currently being acquired at a particular frame rate (such as the first frame rate). The input mechanism that selects or activates the second frame rate (which may be the same input mechanism that selects the first frame rate, or a different, separate user input mechanism) will, for clarity in this disclosure, be referred to as the second UI input element.
Activating the second UI input element, the "slow motion" element, may be configured to acquire images at a second, higher frame rate. For example, in one embodiment, the first frame rate may be 30 fps and the second frame rate may be 60 fps. The images may be collected such that a single file contains images acquired at multiple frame rates, such as at the first and second frame rates. As will be explained below, the image data file may be configured such that sequential playback, such as via UI 1000 or any other interface (e.g., a display associated with the computer 102), gives the appearance that images acquired at the second frame rate are in slower motion than images acquired at the first frame rate. For example, in one embodiment, playback may occur at a constant frame rate, which may or may not be the first frame rate. Thus, if a first series of images is acquired at 30 fps and a second series is acquired at 90 fps, playing back the 30 images captured in one second at the first frame rate takes 1 second at 30 fps, whereas the 90 images captured in one second at 90 fps take 3 seconds to play back at 30 fps, thereby providing the appearance of slow motion.
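The playback arithmetic in this paragraph generalizes to: display time = (capture fps * capture seconds) / playback fps. A tiny sketch reproducing the 30/90 fps example from the text:

```python
def playback_seconds(segments, playback_fps=30):
    """Each segment is (capture_fps, capture_seconds). Played back at a
    constant rate, high-frame-rate footage stretches into slow motion."""
    return [(fps * secs) / playback_fps for fps, secs in segments]

# 1 s captured at 30 fps plays in 1 s; 1 s captured at 90 fps takes 3 s
# on a 30 fps timeline, hence the slow-motion appearance.
print(playback_seconds([(30, 1.0), (90, 1.0)]))  # -> [1.0, 3.0]
```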
In one embodiment, a slow-motion element (e.g., element 1004) may be associated with a timing mechanism or function configured to display a timer on UI 1000, during acquisition and/or afterward during editing or playback. The timer may be independent of the total duration of the captured video; for example, images may have been captured several seconds before user input initializing the timing mechanism is received through the corresponding UI element. "Slow motion" acquisition may be deactivated, for example, by user selection and/or by default. In one embodiment, this feature is automatically deactivated once the user no longer presses or otherwise selects element 1004. For example, as shown in FIG. 45, user selection of the "slow motion" element, illustrated as a soft button, causes images to be acquired at the "slow motion" frame rate; once the user releases the soft button, images may again be acquired at a different frame rate. In one embodiment, the frame rate may return to the first frame rate (e.g., the default "normal speed" frame rate). UI 1000 may permit the user to stop image capture through user input.
In one embodiment, selection of one or more input mechanisms may cause cessation of image acquisition at any frame rate. For example, selection of UI element 1002 may stop image acquisition for a file that includes image data acquired at both the first and second frame rates. After the images are acquired, the entire collection may be viewed within a UI, such as UI 1000. As illustrated in FIG. 46, in one embodiment the captured images may be associated with a timeline. Portions of the timeline representing images acquired at the "slow motion" frame rate (e.g., element 1006) may be highlighted or otherwise displayed so as to distinguish them from images acquired at the "normal speed" frame rate. In some embodiments, the UI may permit editing of the captured images, and may include a selectable play element (i.e., element 1005) that allows the user to begin playing the captured images (e.g., the video).
The UI may further permit the user to view each of the images, including in a sequential manner. For example, as illustrated in fig. 47, the user may swipe on the touch screen in a first direction (e.g., to the right) to view the previous sequential images and in a second direction (e.g., to the left) to view the next sequential images. As illustrated by element 1008 in fig. 48, the UI may permit the user to indicate the boundaries of a cut function using a marker. The UI may also include a selectable timer display element (i.e., element 1009). In some embodiments, as illustrated in fig. 49, activating the timer display element may cause the presentation of timer markers (e.g., slider bars 1010a and 1010 b) on the UI. As further illustrated in fig. 50, the user may adjust the positions of the slider bars to mark the beginning and end of the timing function. For example, the user may want to display the respective times of the portions of the images that are cropped. The UI may permit the user to save the cut footage. The footage may or may not be saved with a timer configured to be displayed during the selected portion.
In one embodiment, the output of the systems and methods described herein comprises a single electronic file containing image data (which may be or may include pixel data) representing a first series of sequential images acquired at a first frame rate and a second series of sequential images acquired at a second frame rate. The single file may be stored and/or configured to be played such that images acquired at the second frame rate are displayed so as to appear to represent slow motion. It can be appreciated that one aspect of the present disclosure relates to a single UI allowing a user to capture a first set of sequential images. The UI may be configured to acquire image data such that at least a portion of the first set of images comprises a first series of sequential images acquired at a first frame rate and a second series of sequential images acquired at a second frame rate, wherein the acquisition is user selectable. The user selection may occur as the images are acquired, such as by activating a UI input element to acquire images at the second frame rate. In other embodiments, the images may be acquired at a first frame rate that is faster than the second frame rate. Then, after acquisition, the user may provide user input to adjust the frame rate of the images acquired at the faster frame rate, so that they are marked, or even permanently changed, to be displayed differently during playback. For example, images may be acquired at a first frame rate of 120 frames per second, and a user may provide user input (or an automated process may perform an action to achieve the same result) to designate certain images to provide an appearance of 30fps; for example, every fourth image of the images acquired at 120fps may be utilized. Thus, during playback, the marked or altered imagery can be played so as to establish an appearance of normal speed, while the unaltered imagery (every image acquired at 120fps) played at a 30fps frame rate establishes an appearance of slow motion.
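A minimal sketch of the frame-decimation approach described above, assuming a 120fps capture and a 30fps playback rate; the variable names are illustrative, not from this disclosure:

```python
# Illustrative sketch (not the patent's implementation): footage captured at
# a fast first frame rate (120 fps); marking every fourth frame yields a
# 30 fps "normal speed" track, while playing all frames at 30 fps yields
# 4x slow motion.

CAPTURE_FPS = 120
PLAYBACK_FPS = 30
STEP = CAPTURE_FPS // PLAYBACK_FPS  # 4: keep every fourth frame for normal speed

frames = list(range(480))  # stand-in for 4 seconds of captured frames

normal_speed = frames[::STEP]  # 120 frames -> 4 s at 30 fps (real time)
slow_motion = frames           # 480 frames -> 16 s at 30 fps (4x slower)

print(len(normal_speed) / PLAYBACK_FPS)  # 4.0 seconds
print(len(slow_motion) / PLAYBACK_FPS)   # 16.0 seconds
```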
In other embodiments, one or more non-transitory computer-readable media may include computer-readable instructions that, when executed by a processor, permit a user to upload, or share, data of their performance (such as video) in a manner that allows at least one third party access to the image data. In one embodiment, a computing device associated with a user, such as computer 102, may transmit image data (e.g., video) of their performance and/or corresponding athletic activity data to a display device. For example, the computer 102 may wirelessly communicate image data of their performance and corresponding athletic activity data to a display device via bluetooth or some other near field communication technology. In another embodiment, the image data (and/or physical activity data) may be transmitted in real-time.
One or more images (with corresponding activity data) may be displayed on one or more display devices, such as a display at a location of a physical activity (e.g., a skateboard park), a display at a retail location, or any other display media, including, but not limited to, multi-casting to multiple display devices. The images (and associated activity data) may be viewable via a television, a computing device, a network interface, and/or combinations thereof. For example, if a user enters a retail location, a third party may access available image data associated with the user such that the image data is displayed on one or more devices of the retail location. In one example, image data of a user performing one or more physical activities (e.g., skills or challenges) may be uploaded and displayed on a display device. The image data displayed on the display device may be uploaded by a computer associated with the user (e.g., computer 104) to a server (e.g., server 134) or some other location, such as a data sharing site. Examples of upload or file sharing sites may include YouTube (www.youtube.com), Google+ (www.google.com/plus), and/or Facebook (www.facebook.com). Those skilled in the art will appreciate that these sites are merely exemplary and that other locations configured to permit the collection and downloading of electronic information are also relevant to the present disclosure.
In some embodiments, a user (e.g., user 124) and/or other individuals may selectively determine which images and/or activity data are displayed on one or more display devices. The display of any data (and/or the selection of which physical activity data and image data are displayed) may vary depending on one or more variables, including, for example, the user's location, the user's current activity score, the user's selection or input, a viewer's input, an indication that the user's performance has met a threshold (e.g., a performance zone has been reached), and/or combinations thereof. Based on one or more computer-readable instructions on a non-transitory computer-readable medium, other embodiments may determine which image data and/or activity values may be displayed to the viewer(s) for a particular time period, as well as the duration for which such data is displayed.
As other examples, data transmitted by the computer 102 may be used by a remote system to trigger an audio or video display containing user-specific information, or other triggered information, for comparing a user's performance metrics with those of other individuals, according to example embodiments. Such information may be displayed at a retail location, a skateboard park venue, or another location. The data transmitted by the computer 102 may include athletic performance information associated with the user or other users, which may be used to generate a leaderboard. For example, a display located at a skateboard park venue or retail location may provide a leaderboard for comparing the user's performance with friends, selected professional athletes, or all other users including professional athletes. An example leaderboard may rank the highest number of activity points (or activity scores), total skills performed, total challenges performed, total awards earned, or other performance metrics. Other example leaderboards may rank the highest number of comments or "likes" on videos associated with a user or with a skill/challenge performed by a user. In this example, if a user receives multiple positive comments (or "likes") corresponding to a particular skill (or other athletic activity) performed by the user, the user may receive a better position or ranking on the leaderboard.
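A hypothetical sketch of such a leaderboard, ranking users by a chosen metric; the metric names and values are invented for illustration:

```python
# Minimal leaderboard sketch consistent with the paragraph above.

users = [
    {"name": "user_a", "activity_points": 1250, "likes": 34},
    {"name": "user_b", "activity_points": 980,  "likes": 51},
    {"name": "user_c", "activity_points": 1410, "likes": 12},
]

def leaderboard(entries, metric):
    """Rank users by a chosen performance metric, highest first."""
    return sorted(entries, key=lambda e: e[metric], reverse=True)

for rank, entry in enumerate(leaderboard(users, "activity_points"), start=1):
    print(rank, entry["name"], entry["activity_points"])
```

The same function ranks by "likes" simply by passing a different metric name, matching the text's point that multiple leaderboards can be derived from one data set.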
In other embodiments, one or more non-transitory computer-readable media may include computer-readable instructions that, when executed by a processor, permit a user to associate one or more other users with particular image data (e.g., video). For example, a first user may use computer 102 to acquire image data of a second user performing an athletic activity (e.g., a skill). The computer 102 or the user may assign tags to the captured image data. When the first user wishes to store (or save) the captured image data, the UI may prompt the user to generate a tag associating the second user (or another user) with the captured image data. The user may be prompted in any manner to select a tag for the acquired image data. One such way in which a user may be prompted to select a tag may be for the UI to display a list of other users who may be "tagged" in (e.g., associated with) the captured image data. In other embodiments, the user may manually select one or more users to associate with the acquired image data. In some embodiments, after a user has been "tagged" in (e.g., associated with) particular image data, the tagged user may then claim "ownership" of the image data, such as by becoming the person having joint or separate rights to edit, delete, copy, and alter the image data, and/or to control the access or editing rights of others, including the individual that captured the image data. Some example embodiments are discussed below.
In one embodiment, after the first user associates the second user with the captured image data, the first user may no longer be associated with the captured image data, such that the first user has only a limited number of functions or options that may be performed via the UI with respect to the captured image data. For example, before designating the second user in the captured image data, the first user may have the option to edit and/or delete the captured image data, associate the captured image data with one or more users, upload the image data to a server or a file sharing site, and many other options. After the second user has been designated in (e.g., associated with) the captured image data, the first user may no longer perform one or more functions or options previously available to the first user via the UI (e.g., designating users, uploading the image data, editing the image data, etc.); instead, the second user may now access these features and options relating to the captured image data. Tagging image data thus permits assigning "ownership" of the captured image data irrespective of the particular device (or the owner thereof) used to capture the image data. Thus, a user no longer needs to use their own image capture device to capture their athletic activity, but may instead claim ownership of image data captured by other individuals as if the user had actually captured the image data on their own device.
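The ownership hand-off described above might be sketched as follows; the class and method names are invented for illustration and are not part of this disclosure:

```python
# Hedged sketch: tagging a second user transfers edit/upload rights away from
# the capturing user, per the embodiment described above.

class CapturedVideo:
    def __init__(self, captured_by: str):
        self.captured_by = captured_by
        self.owner = captured_by          # capturer owns it initially
        self.tagged_users: list[str] = []

    def tag(self, user: str) -> None:
        """Associate `user` with the video and transfer ownership to them."""
        self.tagged_users.append(user)
        self.owner = user                 # capturer loses edit/upload rights

    def can_edit(self, user: str) -> bool:
        return user == self.owner

video = CapturedVideo(captured_by="first_user")
video.tag("second_user")
print(video.can_edit("first_user"))   # False: options no longer available
print(video.can_edit("second_user"))  # True: tagged user may edit/upload
```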
In certain embodiments, one or more non-transitory computer-readable media may include computer-executable instructions that, when executed by a processor, permit a user to associate location data with acquired image data. For example, a first user may use the computer 102 to acquire image data of a second user performing an athletic activity (e.g., a skill). The computer 102 or the user may assign a location tag to the captured image data. When the first user wishes to store (or save) the captured images, the UI may prompt the user to generate a location tag associating the captured image data with location data corresponding to the location at which the athletic activity was performed. The user may be prompted in any manner to select a location tag for the acquired image data. One such way in which a user may be prompted to select a location tag may be to display a geographical map to the user and prompt the user to identify their current location. In other embodiments, location data (e.g., GPS data) may be used to generate suggested or recommended location tags. In some embodiments, a user (e.g., user 124) may selectively determine whether location data may be shared with, or made available to, other users or groups of users. For example, the first user may adjust one or more UI preferences or settings such that the location data of any image data (e.g., video) associated with the first user is only available to friends or to other users or groups of users already identified by the first user.
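One way such a recommendation could be sketched is to suggest the known venue nearest the GPS fix; the venue list, coordinates, and distance threshold below are illustrative assumptions:

```python
# Hypothetical sketch of recommending a location tag from GPS data.

import math

KNOWN_VENUES = {
    "Downtown Skatepark": (45.5231, -122.6765),
    "Riverside Plaza":    (45.5122, -122.6587),
}

def distance_m(a, b):
    """Approximate great-circle distance in meters (haversine)."""
    lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
    h = (math.sin((lat2 - lat1) / 2) ** 2
         + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
    return 6371000 * 2 * math.asin(math.sqrt(h))

def suggest_tag(gps_fix, max_distance_m=250):
    name, venue = min(KNOWN_VENUES.items(), key=lambda kv: distance_m(gps_fix, kv[1]))
    return name if distance_m(gps_fix, venue) <= max_distance_m else None

print(suggest_tag((45.5230, -122.6760)))  # "Downtown Skatepark"
```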
Other embodiments may test a user's skill level across different skill sets and provide analysis of the user's performance. For example, a pie chart may readily show that the user favors manuals, frontside boardslides, and/or other tricks. Additionally, the analysis may show proficiency in a given environment, such as a sports field or skateboard park, or in transitions, such as from a ground surface to a rail, or from a rail back to a ground surface. The proficiency level of the user may be established based on certain challenges. FIGS. 51-91 illustrate tree diagrams of various tricks that can be performed by a user according to various embodiments. For example, FIG. 51 shows an example "skill tree" for tricks related to a user performing "frontside" regular-stance tricks along a surface or flat ground. As will be appreciated, the "frontside" element of the skill tree illustrated in fig. 51 refers to the direction of rotation of the user during the trick. As illustrated in fig. 51, various hubs are included in the skill tree that identify various branches (components) and sub-branches (sub-components) of the flat-ground regular-stance tricks that may be performed by the user with a frontside rotation. The first branch of the skill tree in FIG. 51 is the ollie. The second branch of the skill tree is the rotation hub, which identifies tricks incorporating an angle of rotation (or spin). The rotation hub is subdivided into two sub-branches (e.g., tricks): the frontside 180 and the frontside 360. The third branch of the skill tree is the kickflip hub, which identifies the various types of kickflip tricks that can be performed by the user. As illustrated in fig. 51, some of the sub-branches (e.g., tricks) of the kickflip hub also include rotational branches (e.g., frontside 180 kickflip and 360 kickflip). The fourth branch of the skill tree is the heelflip hub, which identifies the various types of heelflip tricks that can be performed by the user. The last branch of the skill tree is the shove-it hub, which identifies the various types of shove-it tricks that can be performed by the user.
In some embodiments, the UI may display a skill tree as shown in fig. 51 so that the user may identify one or more tricks to attempt. In other embodiments, the UI may provide an indication showing one or more tricks in the tree that the user has previously performed. As another example, fig. 52 shows an example "skill tree" for a user performing "backside" regular-stance tricks along a surface or flat ground. As will be appreciated, the "backside" element of the skill tree illustrated in fig. 52 refers to the direction of rotation of the user during the trick. As illustrated in fig. 52, various hubs are included in the skill tree that identify various branches and sub-branches of flat-ground regular-stance tricks with a backside rotation that can be performed by the user. Fig. 53-58 illustrate various other skill trees for flat-ground tricks, including switch, fakie, and nollie variations. FIGS. 59-74 illustrate various other skill trees for tricks that may be performed on a particular surface, such as a ledge, rail, or pipe. Fig. 75-91 illustrate skill trees of various other tricks, such as stationary tricks, slides and grinds, and tricks that may be performed while the user is stationary. Those skilled in the art will appreciate that a skateboarder may "grind" on a surface (e.g., a metal rail) using at least one of the trucks, and may perform a "slide" trick in which the board (or a portion of the board other than the trucks) contacts the surface. For example, a first challenge may require the user to perform a grind trick, and a second challenge may require that the grind be a frontside grind. Still other challenges may require the user to perform one or more specific types of backside grinds. Other embodiments may require the user to perform a specific grind trick, followed by a slide trick, within a specific time frame or transition period.
One or more rewards may be related to the performance of the user. As discussed below, the user's performance may be ranked by one or more third parties, including defined communities, individuals, friends, colleagues, and combinations thereof. A reward may be based on style, user control, and/or effectiveness. For example, control may be based on one or more sensor inputs; e.g., changes in the user's weight distribution and/or forces measured across one or more axes may be measured and compared to predetermined and/or desired ranges. In yet another embodiment, the impact forces of the user may be measured. A sensor may measure force, such as, but not limited to, the sensors described herein. Some embodiments may provide prompts for the placement and/or location of one or more image capture devices and/or other sensors. For example, instructions may be provided to facilitate the acquisition of a plurality of images from a plurality of perspectives. In still other embodiments, image data may be available from multiple perspectives without requiring the user to acquire images at a particular location or the like.
Some embodiments may permit users to upload, or otherwise share, data of their performances (such as video) in a manner that allows at least one third party to access the image data. In this regard, certain embodiments relate to correlating image data with data regarding physical activity, including, but not limited to, the raw data and/or processed data disclosed in any of the embodiments disclosed herein. Data (raw or processed) relating to physical activity may be obtained directly or indirectly, and/or may be derived from one or more sensors, including those described herein. According to some embodiments, the physical activity data may be overlaid on images (or a sequence of images, e.g., video) of the user captured during performance of the physical activity, such as a skateboarder (who may be the user 124 shown in fig. 1). Examples are provided below, including, but not limited to, fig. 42.
Other embodiments may allow one or more users to challenge other third parties, including, but not limited to, members of a community. In some embodiments, a user may challenge one or more members of a defined community. In some implementations, the user may issue and/or receive challenges based on skill, experience, location, standing within the community, age, and/or combinations of these with other factors. Other embodiments may analyze motion within the image data. Example image analysis may include, but is not limited to, one or more of the systems and methods described in U.S. patent application 61/79,372.
One or more metrics, which may be measured directly and/or indirectly from images and/or other sensor data, may be utilized to formulate and provide recommendations to users, such as for footwear, apparel, and equipment, such as skateboards, trucks, rails, and/or other products. For example, the tendency of a user to rotate left or right during the performance of one or more tricks may be utilized to recommend a particular shoe, skateboard, and/or combination of the two. As another non-limiting example, recommendations for footwear may be formulated based on the user's tendencies, skill level, performance, and/or other factors that may be measured by the sensors, such as force, impact, acceleration, and/or image data. For example, footwear may be selected based on cushioning, flexible regions, and/or support structures, and other embodiments may consider threshold levels of protection against the impact forces detected during the user's performance of one or more tricks.
Other embodiments may recommend a particular location (or time at a particular location) for performing at least one trick (such as described herein), including, but not limited to, those discussed with respect to fig. 37 and 38. Although these figures are described in the context of a basketball court, those skilled in the art will appreciate that any environment is within the scope of the present disclosure. Other embodiments, for example, upon a user's successful completion of one or more tricks (which may be determined based on sensor and/or manual analysis), may unlock the ability to create tangible goods. For example, successfully completing a skill tree may unlock the ability to build a custom t-shirt that includes an image of the user performing at least one of the tricks and/or data from the performance of one of the tricks. As the user progresses with respect to one or more skills, other features and/or capabilities may be unlocked or otherwise made available.
Although examples of the present disclosure are provided with respect to skateboarding, those skilled in the art will appreciate that the teachings herein apply to any athletic activity, such as, for example, basketball. FIGS. 27-30 illustrate display screens of a GUI for a basketball shot training session according to an example embodiment. In fig. 27, the training display 2702 may present the user with information from their last session (e.g., free throws, three-point shots, and jump shot percentages) and prompt the user to begin a new session. The computer 102 may monitor touches on a pressure-sensitive display screen to track successes and failures. To do so, the computer 102 may monitor how many fingers are used to distinguish between types of basketball shots. For example, three fingers may be used to indicate a three-point shot, two fingers may be used to indicate a two-point shot, and a single finger may be used to indicate a free throw, as shown in fig. 28. A tap of one or more fingers on the display screen may indicate a successful shot, and a swipe of one or more fingers across a portion of the display screen may indicate a failure. In other examples, a downward swipe of one or more fingers across the display screen of the computer 102 may indicate success, and an upward swipe of one or more fingers may indicate failure.
The computer 102 may process the user input to determine the number of fingers used and to distinguish between a tap and a swipe. The computer 102 may determine the area of the display screen covered by the fingers when tapping and/or swiping the display screen to distinguish between one, two, or three fingers. The computer 102 may also determine the duration of the touch, and whether the section of the display screen initially contacted by the user differs from the section of the display screen at the end of the touch, to distinguish between a tap and a swipe. At the end of the session, the training display 2702 may display success and failure information to the user, as seen in FIG. 29. The training display 2702 may display successes/failures by shot type, as well as the total successes/failures for all shot types. For example, training display 2702A may display successes and failures of free throws, and training display 2702B may display successes and failures of jump shots. The training display 2702B may total two- and three-point shots and display their successes and failures together, or a separate display may present the successes and failures of each type of shot.
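The touch-classification rules described above might be sketched as follows; the event structure, region encoding, and 0.3-second tap threshold are illustrative assumptions:

```python
# Illustrative sketch of the touch-input rules above: finger count selects
# the shot type, and a tap versus a swipe records a success versus a failure.

SHOT_TYPES = {1: "free throw", 2: "two-point", 3: "three-point"}

def classify_touch(fingers: int, start_region: tuple, end_region: tuple,
                   duration_s: float) -> tuple[str, bool]:
    """Return (shot type, successful?) from a single touch event."""
    shot = SHOT_TYPES.get(fingers, "unknown")
    # A short touch that starts and ends in the same screen region is a tap
    # (success); movement across regions indicates a swipe (failure).
    is_tap = start_region == end_region and duration_s < 0.3
    return shot, is_tap

print(classify_touch(3, (1, 1), (1, 1), 0.15))  # ('three-point', True)
print(classify_touch(1, (1, 1), (3, 1), 0.40))  # ('free throw', False)
```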
FIG. 30 illustrates an example display of a GUI that provides a user with information about a shot practice session according to an example embodiment. The shot total display 3002A may permit the user to select all shots or a particular shot type to receive information such as the shot percentage (e.g., 55.6%), how many shots were successful in succession, and the vertical jump "most efficient shot point" at which the user most often shoots successfully. The most efficient shot point may indicate a vertical jump height at which the user's shot percentage (e.g., percentage of successful shots) exceeds a predetermined amount (e.g., 50%). The computer 102 may process data from the pod sensors 304 and/or the distributed sensors 306 to provide the user, via the GUI, with information about their successes and failures. This information may include the average vertical jump heights for successes and failures, to inform the user how jump height affects his or her shot performance. The shot total display 3002B may inform the user which foot was used when jumping as part of a shot, the vertical jump height, and whether the shot succeeded or failed. The shot total display 3002C may provide the user with information regarding the successes and failures of free throws.
The shot total display 3002 may provide the user with statistical information about how their balance affects their shooting by indicating how many balanced shots were successful and how many unbalanced shots were successful. When the user takes a shot, the computer 102 may determine balance based on the weight distribution measured by the distributed sensors. If the weight is distributed relatively evenly between the user's feet (i.e., within a certain threshold), the computer 102 may identify the shot as balanced. When the weight is not relatively evenly distributed between the user's feet (i.e., outside of a certain threshold), the computer 102 may identify the shot as unbalanced. The shot total display 3002C may also provide the user with feedback on their balance and prompts to correct any problems with unbalanced weight distribution. For example, field 3004 may indicate how many shots were successful when the user's weight was balanced, and field 3006 may indicate how many shots were successful when the user's weight was unbalanced.
In an example, the computer 102 may receive and process data generated by force sensors to determine the weight distribution during performance of an exercise task (e.g., taking a jump shot in basketball). The computer 102 may process user input indicating that the exercise task was successfully completed. The computer 102 may associate the weight distribution detected at a time prior to the user input with the successful completion of the exercise task. For example, the computer 102 may process the sensor data to identify movement consistent with a shot, and may determine the weight distribution at takeoff, during the period between takeoff and landing, and during the period after landing of a jump shot. The computer 102 may monitor the weight distribution during these periods. At a subsequent time (e.g., a second or subsequent jump shot), the computer 102 may process additional user input indicating unsuccessful completion (e.g., failure) of the exercise task. The computer 102 may associate the weight distribution detected at a time prior to that user input with the unsuccessful completion of the exercise task. After or during the exercise session, the computer 102 may present the user with information about their weight distribution and how the distribution affected the user's ability to complete the exercise task.
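A minimal sketch of the balance association, assuming left/right force readings from distributed shoe sensors and an illustrative 10% threshold around an even split:

```python
# Hedged sketch: a shot is "balanced" when the weight split stays within a
# threshold of 50/50 before takeoff. Threshold and data are illustrative.

BALANCE_THRESHOLD = 0.10  # within 10% of an even split

def is_balanced(left_force: float, right_force: float) -> bool:
    total = left_force + right_force
    return abs(left_force / total - 0.5) <= BALANCE_THRESHOLD if total else False

shots = [
    {"left": 410.0, "right": 395.0, "made": True},   # near-even split
    {"left": 520.0, "right": 260.0, "made": False},  # heavily left-loaded
]

balanced_makes = sum(1 for s in shots if is_balanced(s["left"], s["right"]) and s["made"])
unbalanced_makes = sum(1 for s in shots if not is_balanced(s["left"], s["right"]) and s["made"])
print(balanced_makes, unbalanced_makes)  # 1 0
```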
The GUI may also provide incentives for the user to continue their basketball shooting tasks. FIG. 31 illustrates an example display of a GUI notifying a user of a shooting milestone, according to an example embodiment. Milestone display 3102 may inform the user of one or more shooting thresholds and how many successful shots the user has made. For example, milestone display 3102 may indicate that the user has made 108 shots, such that the user has reached amateur status, and needs to make an additional 392 successful shots to reach the next status level.
As part of training to improve the user's skills, the computer 102 may prompt the user to perform moves similar to those used by professional athletes. FIG. 32 illustrates an example signature move display of a GUI for prompting a user to perform a workout mimicking a professional athlete's signature move, according to an example embodiment. In addition to professional athletes' signature moves, a user may create and share signature moves with other users.
In an example, a user may enter a search query into the signature move display 3202A to initiate a search for a desired professional athlete. The computer 102 may forward the search query to the server 134, which may reply with results matching the query. The server 134 may also provide suggested signature moves to the computer 102 for display prior to the user entering a search query. As seen in the signature move display 3202A, the computer 102 may display different signature moves for user selection. Upon selection of a particular move, the signature move display 3202B may present a video of the signature move and provide the performance metrics of the professional athlete for that move. For example, the computer 102 may query the server 134 for signature move data in response to a user's selection, to generate the signature move display 3202B. The signature move data may include data obtained from the pod sensors 304 and the distributed sensors 306 while the professional athlete performed the signature move. The user may attempt to mimic the signature move, and the computer 102 may process the user data to indicate the accuracy of the mimicry.
After the user attempts the signature move, the computer 102 may inform the user how successfully they mimicked the move. To identify a match, the computer 102 may compare data obtained from the pod sensors 304 and/or the distributed sensors 306 with the signature move data to determine whether the two are similar. The computer 102 may monitor how long the user takes to complete the signature move, the user's vertical jump, the user's hang time, the user's tempo, or other information, and compare this data with the corresponding data of the professional athlete. The computer 102 may also indicate how accurately the user mimicked the professional athlete's signature move, as shown in signature move display 3202C. Accuracy may be determined based on a combination of how similar each of the performance metrics is to the professional athlete's metrics. The computer 102 may weight some metrics more heavily than others, or may weight each metric equally. For example, the signature move data may provide information for three different metrics, and the user's data may be compared with each of the three metrics. The computer 102 may determine the ratio of each user performance metric to the professional athlete's metric, and may identify a match if the ratio is above a threshold (e.g., approximately 80%). Accuracy may also be determined in other ways.
In an example, the computer 102 may receive signature move data corresponding to measurements of the acceleration and force measured from a first user (e.g., a professional athlete) performing a sequence of exercise tasks (e.g., dunking a basketball). By monitoring a second user attempting to perform the same sequence of exercise tasks, the computer 102 may receive and process user data generated by at least one of the sensors 304 and 306. In turn, the computer 102 may generate a similarity metric indicating how similar the user data is to the signature move data.
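The similarity metric might be sketched as a weighted combination of per-metric ratios, with the approximately-80% match threshold from the text; the metric names, weights, and values are illustrative:

```python
# Hedged sketch of the comparison above: each user metric is taken as a ratio
# of the professional's metric, the ratios are weighted and combined, and a
# match is declared above a threshold.

def similarity(user: dict, pro: dict, weights: dict) -> float:
    score = 0.0
    for metric, weight in weights.items():
        ratio = min(user[metric] / pro[metric], 1.0)  # cap at 100% per metric
        score += weight * ratio
    return score / sum(weights.values())

pro_move  = {"vertical_in": 36.0, "hang_time_s": 0.90, "tempo": 1.8}
user_move = {"vertical_in": 30.0, "hang_time_s": 0.80, "tempo": 1.7}
weights   = {"vertical_in": 1.0, "hang_time_s": 1.0, "tempo": 1.0}  # equal weighting

s = similarity(user_move, pro_move, weights)
print(f"{s:.0%}", "match" if s >= 0.80 else "no match")  # "89% match"
```

Capping each ratio at 100% prevents a user who exceeds the professional on one metric from masking a deficit on another.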
As part of a social network, the computer 102 may also provide the user with data from other users and/or professional athletes for comparison. FIG. 33 illustrates an example display of a GUI for searching for other users and/or professional athletes for comparison of performance metrics according to an example embodiment. The computer 102 may communicate with the server 134 to identify professional athletes or friends of the user, as seen in display 3302A. Each individual may be associated with a unique identifier. For example, the user may choose to add a friend or a professional, as seen in the GUI display on the left. When the user chooses to add a friend/professional, the user may enter a search query into the computer 102 for communication to the server 134, which may reply with people and/or professional athletes matching the search query, as shown in display 3302B. The user may create a user profile identifying their friends and/or favorite professional athletes so that the computer 102 may automatically load those individuals' profiles, as shown in display 3302C.
The computer 102 may present data for sharing with friends and/or for posting to a social networking site. In fig. 34, for example, display 3402A provides information for sharing, including points, vertical jump, total hang time, and highest tempo. Display 3402B, for example, provides a side-by-side comparison of the performance metrics of the user and a specified friend. In an example, the server 134 may store performance metric data for each user and may communicate the data to the computers 102 of other users that request it.
FIG. 35 illustrates an example display for comparing a user's performance metrics with those of other individuals according to an example embodiment. For example, display 3502A may provide a leaderboard for comparing the user's performance metrics with those of friends, selected professional athletes, or all other users including professional athletes. An example leaderboard may rank the highest vertical jump, the highest tempo, total hang time, total games played, total awards won, or other performance metrics. Display 3502B permits the user to view individuals whose performance metrics indicate that they are or are not in a performance zone (e.g., a dunk zone). The computer 102 may also permit the user to compare their performance metrics with those of a particular group (e.g., friends) or with all users. The foregoing discussion has primarily referred to basketball, but the examples above may be applied to other team sports as well as to individual sports, such as skateboarding.
Fig. 36 illustrates a flowchart of an example method for determining whether physical activity data obtained while monitoring a user's physical activity is within a performance zone, according to an example embodiment. The method of fig. 36 may be implemented by a computer, such as, for example, the computer 102, the server 134, a distributed computing system, a cloud computer, other devices, and combinations thereof. The order of the steps shown in fig. 36 may be rearranged, additional steps may be included, some steps may be removed, and some steps may be repeated one or more times. The method may begin at block 3602.
In block 3602, the method may include processing input specifying user attributes. In an example, the computer 102 may prompt the user to enter one or more user attributes. Example user attributes include height, weight, arm length, leg length, wingspan, and the like. In an example, the user may specify their standing reach, which may be a measure of how high the user's hand can reach while keeping both feet on the floor.
In block 3604, the method may include adjusting a performance zone based on the user attributes. In an example, the computer 102 may adjust the performance zone specifying how high the user must jump to dunk a basketball, based on the user's height, arm length, torso length, and leg length. For a relatively taller user, the performance zone may specify a lower minimum jump height for dunking a basketball, as compared with the minimum height required for a relatively shorter user to dunk or reach the rim of the basket.
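Such an adjustment might be sketched as follows, assuming a regulation 10-foot rim and an illustrative clearance value; the attribute used (standing reach) is also an assumption:

```python
# Sketch of adjusting the dunk performance zone to user attributes.

RIM_HEIGHT_IN = 120.0  # regulation 10-foot rim
CLEARANCE_IN = 6.0     # assumed clearance needed above the rim to dunk

def min_dunk_jump_in(standing_reach_in: float) -> float:
    """Minimum vertical jump (inches) for this user's performance zone."""
    return max(RIM_HEIGHT_IN + CLEARANCE_IN - standing_reach_in, 0.0)

print(min_dunk_jump_in(standing_reach_in=102.0))  # taller user: 24.0 in
print(min_dunk_jump_in(standing_reach_in=90.0))   # shorter user: 36.0 in
```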
In block 3606, the method may include receiving data generated by a sensor. In an example, the computer 102 may receive data from at least one of the sensors 304 and 306 during an exercise session in which the user performs one or more jumps. As discussed above, the data may be raw signals or may be data processed by the sensors before being sent to the computer 102.
In block 3608, the method may include determining whether the data is within the performance zone. In an example, the computer 102 may process the data received from at least one of the sensors 206 and 304 to determine whether any jump performed by the user meets or exceeds the minimum jump height of the performance zone adjusted to the user's attributes. For example, the computer 102 may determine, based on the user attributes, that the minimum vertical jump required for the user to dunk a basketball is 30 inches. The computer 102 may process data received from at least one of the sensors 304 and 306 to determine whether any jump made by the user meets or exceeds 30 inches. To determine the height of a vertical jump, the computer 102 may process data generated by at least one of an accelerometer and a force sensor and compare the data with jump data to confirm that the data is consistent with an actual jump (e.g., rather than a user sitting in a chair merely lifting their feet off the ground for a predetermined amount of time). In response to the comparison, the computer 102 may process the data generated by the accelerometer and the force sensor to determine a takeoff time, a landing time, and a rise time. The computer 102 may calculate the vertical jump height based on the rise time.
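Under a simple ballistic model, the jump height follows from the rise time (half the hang time between takeoff and landing) as h = ½g·t²; a sketch with illustrative inputs:

```python
# Minimal ballistic sketch of the jump-height computation the text outlines:
# takeoff and landing times give the hang time, and height follows from
# h = g * hang^2 / 8 (rise time is half the hang time). Sensor parsing omitted.

G_IN_PER_S2 = 386.09  # gravity in inches/s^2

def vertical_jump_in(takeoff_s: float, landing_s: float) -> float:
    hang = landing_s - takeoff_s
    rise = hang / 2.0                      # time to apex
    return 0.5 * G_IN_PER_S2 * rise ** 2   # equivalently g * hang^2 / 8

print(round(vertical_jump_in(takeoff_s=0.00, landing_s=0.79), 1))  # ~30.1 in
```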
In block 3610, the method may include outputting the determination. In an example, the computer 102 may output a determination of whether the user is within the performance zone. The output may be at least one of audible and visual. The computer 102 may provide the output immediately upon detecting that the user is within the performance zone, or may output the determination at some later time (e.g., after the exercise session). The method may then end, or may return to any of the preceding steps.
When the tracking option is selected, the computer 102 may update the GUI to notify the user of opportunities and locations to participate in an event (e.g., a basketball game), as shown in FIGS. 37-38. For example, the computer 102 may communicate a geographic location (e.g., a GPS location) to the server 134, which may reply with events that are ongoing nearby or are scheduled to begin soon (e.g., within the next hour). Fig. 37 illustrates two example GUI displays for identifying nearby basketball courts. On the left, the GUI of the computer 102 may provide a listing of nearby basketball courts and may provide a map to assist the user in locating a selected court. The GUI also permits the user to add other courts along with the addresses of those courts. On the right, the GUI presents information about the selected court. For example, the GUI may display regular players (e.g., the "king of the court" who plays most often at that court), as well as performance metrics of various players at that court (e.g., the player with the highest vertical jump recorded at that court, the player with the most steps per second, etc.). The GUI may prompt the user to check in to the selected court and may indicate the number of active players at that court. When the user checks in, the computer 102 may communicate the check-in information to the server 134 via the network 132, and the server 134 may update a database to indicate the number of times the user has checked in to the court. The server 134 may also communicate check-in numbers, via the network 132, to the computing devices of other users requesting information about the court. The GUI may also assist the user in identifying courts where certain other users are playing.
FIG. 38 illustrates an example GUI for obtaining activity information about other participants. The GUI may permit the user to search for friends or other individuals to determine where they are currently playing. The server 134 may store information about who is playing at each court (or other location) and, when requested, may communicate this information to the user. The user may also establish a user profile that identifies individuals of interest with whom the user may wish to play or compete. Each user may be associated with a unique identifier that may be stored in the user profile and/or by the server 134. The computer 102 may communicate a query containing the unique identifiers of one or more users to the server 134, which may respond with information about the queried users. As seen in FIG. 38, the GUI may display information about which of the selected users are playing, as well as the history and/or achievements of users who are not currently playing. When the computer 102 requests information about a particular venue, the server 134 may communicate to the computing device 101 data about users who have played at that venue (e.g., highest vertical jump, number of regular players, etc.).
The GUI may be used to assist the user in finding ongoing and/or upcoming games, identifying other players, and/or reviewing leaderboards. The GUI may permit the user to start a new game (e.g., a basketball game) and invite other players at a certain time (e.g., "play a soccer game with me at the high school field at two pm"). The GUI may also display leaderboard information.
As seen in FIG. 38, a history field may inform the user of the achievements of other individuals. For example, the computer 102 may communicate alert data to the server 134 for distributing the user's achievements to other computing devices. A user may choose to receive alerts about certain other users, such as by sending information from the computer 102 to the server 134 with the unique identifiers of those other users. Before starting a game, the user may indicate which performance metrics the user wishes the computer 102 to monitor during the game.
Fig. 39 illustrates a process that may be used to find a location for a sporting activity according to an embodiment. First, in step 3902, a server or other computing device receives location information identifying a location of a user. The location information may be in the form of GPS data and may be received from a portable device, such as a mobile phone. Next, in step 3904, the server or other computing device receives activity information identifying a sporting activity. The activity information may identify a desired sporting activity, such as basketball, football, or soccer. The user may enter the information at the mobile phone, and the mobile phone may transmit the information to the server. Next, in step 3906, the server or other computing device may process the location information and the activity information to identify a location near the user at which to participate in the sporting activity. Step 3906 may include identifying basketball courts, soccer fields, etc. that are currently being used for, or will be used for, the sporting activity. Step 3906 may include accessing a database of athletic activities and a geographic database. In step 3908, the results may be transmitted to the user.
FIG. 40 illustrates a process for sharing performance data according to an embodiment of the invention. First, in step 4002, location information for a user participating in a sporting activity is determined at a mobile terminal. Step 4002 may include using a GPS function of a mobile phone to determine the location of a user participating in a basketball or soccer game. Next, in step 4004, the location information is processed at a processor to determine an identification of the location of the athletic activity. Step 4004 may include processing the GPS data to determine the name of a basketball court or soccer field. In step 4006, sensor data regarding the performance of the user participating in the sporting activity may be received at the mobile terminal. The sensor data may come from one or more of the sensors described herein. The sensor data may be processed at the processor to generate performance data in step 4008. The processing may be performed at the mobile terminal. In some embodiments, all or some of the processing may be performed by one or more sensors. The performance data may include speed, distance, vertical jump height, and foot speed. In step 4010, the identification of the location of the athletic activity and the performance data may be transmitted to a server. The server may maintain a collection of performance data for various users and locations.
FIG. 41 illustrates a process that may be used to track and compare performance data according to an embodiment of the invention. In step 4102, performance information is received at a server from sensors worn by users participating in a sporting activity. Step 4102 may include receiving the information from the sensors at the server via one or more computers, mobile terminals, or other devices in the path between the sensors and the server. The sensors may include one or more of the sensors described herein. In step 4104, location information for the geographic location of the sporting activity may also be received at the server. The location information may be GPS information, the name of a venue, or other information used to identify the location. In step 4106, a database of performance data for the users and performance data associated with the geographic location is maintained. Step 4106 may include maintaining multiple databases and collections of data. Finally, in step 4108, a leaderboard of performance data is maintained. Step 4108 may include maintaining a leaderboard identifying users' maximum vertical jump heights or other performance data. Step 4108 may also include maintaining a leaderboard of the maximum vertical jump heights or other performance data obtained at an identified geographic location, such as a basketball court or soccer field.
In embodiments that utilize location data to maintain a leaderboard or a user's statistics at a particular location, GPS data may be used to determine when the user has left the location. For example, a mobile phone or other device may periodically analyze GPS data to determine when the user has left a basketball court. Similarly, the sensor data may be analyzed to determine when the user has stopped participating in the activity. In other embodiments, the user may be determined to have left the court or venue, or to have stopped participating in the athletic activity, when the user engages in a phone call. Some embodiments may include prompting the user to confirm that he/she has left or stopped participating in the athletic activity when engaging in a call. Some embodiments may also ignore sensor data generated while the user is engaged in a call.
The various embodiments of the present invention described above discuss using GPS data to identify location. Alternate embodiments may determine location by using other techniques, such as WiFi database mapping services. The user may also manually enter location data or search a database of location data.
While the invention has been described with respect to specific examples, including presently preferred modes of carrying out the invention, those skilled in the art will appreciate that there are numerous variations and permutations of the above-described systems and methods. For example, various aspects of the invention may be used in different combinations, and various different subcombinations of aspects of the invention may be used together in a single system or method without departing from the invention. In one example, the software and applications described herein may be implemented as computer-readable instructions stored on a computer-readable medium. Also, various elements, components, and/or steps described above may be changed, reordered, or omitted, and/or additional elements, components, and/or steps may be added without departing from this invention. Accordingly, the present invention should be construed broadly.
Other aspects relate to correlating image data with data relating to physical activity, including, but not limited to, any of the raw data and/or processed data disclosed in any of the embodiments above. Data (raw or processed) relating to physical activity may be obtained directly or indirectly, and/or may be derived from one or more sensors, including those described herein. According to some embodiments, the physical activity data may be overlaid on an image (or sequence of images, e.g., video) of the user, such as user 124, captured during the performance of the physical activity.
Fig. 42 is a flow diagram of an example method that may be utilized in accordance with various embodiments. At exemplary block 3702, image data may be obtained. Image data may be collected from one or more image capture devices, such as a camera located on a mobile terminal device (see element 138 in fig. 1A), a video camera, a still image camera, and/or any apparatus configurable to detect wavelengths of energy, including light, magnetic fields, and/or thermal energy. As used herein, "image data" may encompass raw data and/or compressed data, either in physical tangible form or stored on a computer-readable medium as an electronic file. Further, a plurality of images may form part of a video. Thus, references to images and/or pictures encompass videos and the like.
In one embodiment, image data, such as that captured during a user's performance of physical activity (e.g., participating in a basketball game and/or performing a specific action, such as dunking a basketball), may be collected from one or more devices. For example, a computer-readable medium may include computer-executable instructions that, when executed, may cause a plurality of images (e.g., video) of an athlete engaged in a sporting activity to be obtained. For example, mobile terminal 138 may include an application that permits user 124 (or another user) to capture image data using an image capture device (either part of the mobile terminal 138 or providing input to an external image capture device, such as camera 126).
In one embodiment, the simultaneous capture of video and physical activity sensor data may be initiated when a user activates a recording function (which may be a hard button or a soft button) on a host device (e.g., mobile terminal 138). In some embodiments, multiple cameras may be utilized simultaneously. Multiple cameras may be used, for example, based on the user's location (e.g., as detected via GPS, triangulation, or motion sensors). The image data may be obtained in response to a user operating a camera on a device, such as the camera of mobile terminal 138. In one embodiment, user 124 may hand mobile terminal 138 to another individual, who may capture video of user 124 while playing a sports game or performing a fitness activity. However, in other embodiments, one or more cameras may be at a fixed position, angle, focus, and/or combinations thereof. In some embodiments, the image data may be obtained from a broadcast source that is not directly controllable by user 124 (and/or by individuals or entities under the direction of user 124), such as, for example, a content source provider. For example, a content source provider may broadcast (live and/or with delayed play) a sporting event. In one embodiment, the event may be a scheduled basketball game. However, in another embodiment, the sporting event may be an unscheduled event, such as a pickup game. In certain embodiments, multiple camera feeds may be utilized to determine which feed(s) or image sources are used.
In one embodiment, the image data may be acquired based on the sensor data alone. In one embodiment, the sensor data may be physical activity data. For example, in some embodiments, image data may only be acquired when it is determined that the user is within a "performance zone". In another embodiment, at least one physical attribute value must satisfy a threshold. Other embodiments may indiscriminately acquire image data of user 124, and optional block 3704 or another process may be performed to select a portion of the acquired image data. For example, block 3702 may acquire 20 minutes of image data of user 124, while block 3704 may select only those portions in which user 124 was in the performance zone. Those skilled in the art will readily appreciate that other selection criteria are within the scope of the present disclosure.
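The selection at block 3704 might be sketched as filtering time-stamped frames against the intervals during which the user was determined to be in the performance zone; all data below is illustrative:

```python
# Illustrative sketch: keep only footage captured while the user was in the
# performance zone. Data structures are hypothetical.

zone_intervals = [(12.0, 15.5), (42.0, 44.0)]  # seconds in the zone

def in_zone(t: float) -> bool:
    return any(start <= t <= end for start, end in zone_intervals)

# Frames are (timestamp, frame_id) pairs: 60 s of video at 10 fps.
frames = [(i / 10.0, i) for i in range(600)]
selected = [f for f in frames if in_zone(f[0])]
print(len(frames), "->", len(selected))  # 600 -> 57
```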
The image data obtained in block 3702 (and/or selected at block 3704) may be stored on one or more non-transitory computer-readable media, such as on server 134, network 132, mobile terminal 138, and/or computer 102. The type and/or form of the image data may depend on a myriad of factors, including, but not limited to: physical activity data (e.g., obtained from sensors), user selections, calibration parameters, and combinations thereof. The image data may be time stamped. Time stamping of image data may be performed as part of the collection and/or storage of the image data. The timestamp information may include a "relative" timestamp that does not depend on the actual time of acquisition but is instead related to another event, such as a data point of the activity data, a start time, and/or any other event. In another embodiment, an "actual" timestamp may be utilized, in which the time of acquisition may or may not be related to another event. Those skilled in the art will appreciate that both types of timestamps can be utilized, including utilizing a single actual timestamp that is also related to another event.
At block 3706, physical activity data may be received. As discussed above with respect to the image data, the activity data may also be time stamped. In one embodiment, sensor data may be received, which may include raw and/or processed information relating to the activity of user 124. The activity data may be obtained from one or more sensors described herein. For example, in one embodiment, the user's footwear may include at least one sensor. In some embodiments, at least a portion of the athletic data may remain on the sensing device, or on another device operatively connected to the user (e.g., a wrist-worn device and/or a shoe-mounted sensor), until the acquisition period ends. The data may then be combined into a single file using the time stamps. Some embodiments may store a single file but transmit a first portion of the data (such as the image data) separately from a second portion (such as the activity data). In another embodiment, the first portion of the data (such as the image data) may be stored separately from the second portion (such as the activity data), but both may be transmitted to a first tangible computer-readable medium as a single file.
Multiple sensors (from one or more devices) may be utilized. In one embodiment, raw accelerometer and/or gyroscope data may be obtained and processed. In another embodiment, force sensor data may be received. In yet another embodiment, physical activity parameters may be calculated based on one or more raw parameters from a plurality of sensors. As an example, FIG. 9 illustrates a plurality of data parameters that may be obtained in accordance with certain embodiments. In some embodiments, the user 124, the sensor data, and/or the sensors utilized to obtain the data (and/or to provide calculations of any processed data) may be selectable. For example, user 124 (or another input received from another source, either manually or automatically) may select sensor 140 as being associated with a shoe and/or other apparel. In this regard, the input need not be limited to user 124; for example, a coach, trainer, parent, friend, broadcast individual, and/or any other individual may select one or more sources of activity data. Other embodiments may calibrate one or more sensors before utilizing the corresponding data. In still other embodiments, data from one or more sensors may not be utilized if calibration parameters have not been obtained. FIG. 10 illustrates an exemplary embodiment of calibration; however, the present disclosure is not limited to this embodiment. As discussed above with respect to the image data, at least a portion of the physical activity data may be selected for processing and/or utilization.
At block 3708, the image data and the physical activity data may be associated. The association may be based on the timestamps of the data, such that physical activity data is matched to the image data corresponding to the time of acquisition. In still other embodiments, the data may be filtered, processed, or otherwise adjusted to match one another. For example, each image in a first video of user 124 engaging in athletic activity may represent 1/20 of a second of the first video, while data from a first sensor may provide activity data values every 1/5 of a second; thus, in one embodiment, the four consecutive "frames" of image data within a given 1/5 of a second may be associated with the sensor data acquired during that 1/5-second increment. In still other embodiments, multiple physical activity values may be weighted, averaged, or otherwise adjusted to be associated with a single "frame" or aggregate image. The association of the images may be implemented on one or more computer-readable media.
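The timestamp matching in this example might be sketched as follows, pairing each 1/20-second frame with the 1/5-second sensor sample covering it; names and structures are illustrative:

```python
# Sketch of the alignment above: with 20 fps video and 5 Hz sensor data,
# four consecutive frames share one sensor sample.

VIDEO_FPS = 20
SENSOR_HZ = 5

frame_times = [i / VIDEO_FPS for i in range(20)]             # 1 s of video
sensor_samples = {i / SENSOR_HZ: f"sample_{i}" for i in range(5)}

def sample_for(t: float) -> str:
    """Sensor sample whose 1/5-second window contains frame time t."""
    window_start = int(t * SENSOR_HZ) / SENSOR_HZ
    return sensor_samples[window_start]

associated = [(t, sample_for(t)) for t in frame_times]
print(associated[0], associated[3], associated[4])
# (0.0, 'sample_0') (0.15, 'sample_0') (0.2, 'sample_1')
```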
The correlation of at least a portion of the data may be performed on a real-time basis and/or at a later time. In certain embodiments, correlation may not occur until a portion of the data is selected. In some embodiments, the data may not be correlated until a particular user is selected. For example, the image and/or physical activity data may be correlated when a winner of a game is determined, or upon the occurrence of an event (e.g., a user dunking a basketball). Further, the type and amount of data being correlated may also be selectable. For example, upon determining that a user dunked a basketball, image and/or activity data spanning from 10 seconds before the dunk to 3 seconds after it may be correlated. In one embodiment, upon determining that a player won a game or event, a larger portion of that player's data may be correlated; for example, data covering the entire duration of the game or event may be utilized. Further, which data is correlated may depend on the event, the data collected, or other variables. For example, for a dunk, activity data collected from one or more force sensors within the user's shoes may be utilized, whereas in a soccer game, arm-swing data may be utilized, alone or in combination with other data, to determine steps per second, speed, distance, or other parameters. Correlated data may include, but is not limited to: an identification of the sensing unit, the specific sensor(s), the user, the time stamp(s), calibration parameters, confidence values, or combinations thereof.
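One way to realize the event-triggered window described above (10 seconds before a detected dunk to 3 seconds after it) is a simple time filter over the time-stamped records. A minimal sketch; the event timestamp is assumed to be supplied by some other part of the system.

    def clip_around_event(records, event_t, before_s=10.0, after_s=3.0):
        """Keep only the time-stamped records inside the correlation
        window surrounding a detected event (e.g., a dunk)."""
        return [r for r in records
                if event_t - before_s <= r["t"] <= event_t + after_s]

    # Hypothetical one-record-per-second activity stream.
    records = [{"t": float(t), "speed": t % 5} for t in range(60)]
    window = clip_around_event(records, event_t=30.0)
    print(len(window), window[0]["t"], window[-1]["t"])  # 14 20.0 33.0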
In other embodiments, the system 100 may receive and/or process data generated by sensors (such as force sensors) to determine weight distribution during the performance of an exercise task (e.g., making a jump shot in basketball). The system 100 may correlate the weight distribution detected between user inputs (e.g., a start input and a stop input) over time to determine relevant initial and/or stopping points for the data of interest. At a later time, the system 100 may also process additional user input indicating that the exercise task was not successfully completed.
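A start/stop determination from force data might look like the following threshold-based sketch. The use of total force and the numeric threshold are assumptions; the disclosure does not specify either.

    def activity_bounds(total_force, threshold=700.0):
        """Return (start, stop) sample indices of the span during which
        total measured force exceeds a threshold -- a crude way to pick
        initial and stopping points for the data of interest."""
        active = [i for i, f in enumerate(total_force) if f > threshold]
        return (active[0], active[-1]) if active else None

    forces = [650, 660, 800, 950, 910, 720, 640, 630]  # hypothetical samples
    print(activity_bounds(forces))  # (2, 5)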
The system 100 may process sensor data received during a session, such as data from the pod sensors 304 and/or the FSR sensors 206, to determine which data may be classified and/or correlated. For example, a user's hustle during a session may be classified into two or more categories. With reference to hustle display 1902B, the system 100 may classify hustle into four categories: walking, jogging, running, and sprinting. With reference to hustle display 1902C, the system 100 may classify hustle into three categories: low, medium, and high. More or fewer categories of hustle may be defined. The system 100 may process the data to identify a category based on the user's pace at each time interval (e.g., number of steps per minute). The correlated physical activity data may include information indicating when and/or how often the user was in each category during the session. In some embodiments, only physical activity indicated as falling within one or more specific categories may be correlated with corresponding image data.
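A plain reading of the pace-based categorization is a set of cut-offs over steps per minute. The numeric boundaries below are hypothetical; the text names the four categories but gives no numbers.

    from collections import Counter

    def hustle_category(steps_per_minute):
        # Hypothetical cut-offs; only the category names come from the text.
        if steps_per_minute < 60:
            return "walking"
        if steps_per_minute < 120:
            return "jogging"
        if steps_per_minute < 180:
            return "running"
        return "sprinting"

    pace_per_interval = [45, 70, 130, 190, 185, 90]  # steps/min per interval
    print(Counter(hustle_category(p) for p in pace_per_interval))
    # Counter({'jogging': 2, 'sprinting': 2, 'walking': 1, 'running': 1})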
In some embodiments, data may be transmitted to, and displayed on, one or more devices. In some embodiments, the display device may be physically distinct from the device that acquired the image(s) (e.g., see block 3710). For example, in one embodiment, an individual may utilize a portable device, such as a mobile terminal, to capture video of user 124 performing a physical activity, such as participating in a basketball game. Information about the captured images may be transmitted (before or after correlation with the physical activity data of the depicted user 124) via wired and/or wireless media.
FIG. 13, discussed above, shows an illustrative GUI providing performance metrics during an event, game, or session, according to an example embodiment. The GUI may relay information about the length of a current or previous session in field 1304, present various performance metrics of the user (e.g., highest vertical, total airtime, cadence, etc.) in field 1308, and present who the user played with during the session in field 1310. According to certain embodiments, one or more of these metrics may overlay the corresponding image data. The image data may be combined to form a video, which may be stored as a single file such that the data overlay is part of the video and is displayed along with the corresponding video portion during which the data was captured. In other embodiments, a second file may store the data separately from the video data.
In one embodiment, the image data (and/or physical activity data) may be transmitted in real time. One or more images (with corresponding activity data) may be displayed on one or more display devices, such as a display at the location of the basketball court, or on any other display medium, including, but not limited to, being multicast to multiple display devices. The images (and associated activity data) may be viewed via a television, a computing device, a web interface, and/or combinations thereof. In certain embodiments, user 124 and/or other individuals may selectively determine which activity data is displayed on one or more display devices. For example, a first viewer may selectively view the user's current and/or average speed, while a second viewer may selectively view one or more different activity values, such as, for example, highest vertical jump, number of sprints, average speed, and combinations thereof. In this regard, the data may be formed and/or updated over a long duration, such as the total time of play within a game, or a portion (period, half, etc.) of a game. Thus, there is no requirement that the displayed data relate only to data obtained during the acquisition of the image data; it may instead also include (or be derived from) previously obtained data. Other embodiments may present the image and/or physical activity data for sharing with friends and/or posting to a social networking site. The transmission of any data may be based, at least in part, on at least one criterion, such as, for example, a user-defined criterion that at least a portion of the data meet a threshold. For example, a user may only wish to upload his or her best performance(s).
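The user-defined transmission criterion ("only upload best performances") reduces to a threshold check before sending. A minimal sketch; the metric names and threshold values are hypothetical.

    def should_transmit(session_metrics, thresholds):
        """User-defined criterion: transmit only if at least a portion of
        the data meets its threshold (here: any tracked metric does)."""
        return any(session_metrics.get(name, 0) >= limit
                   for name, limit in thresholds.items())

    metrics = {"vertical_jump_in": 26.5, "top_speed_mph": 14.2}
    print(should_transmit(metrics, {"vertical_jump_in": 24.0}))  # True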
Thus, certain embodiments may utilize historical data. As one example, jump data (such as that shown in jump display 1802B) may display the user's jumps chronologically over the activity period and may indicate the time at which each jump occurred as well as its vertical height. Jump display 1802B may also display the user's current data and/or the user's personal best vertical jump during the session.
Furthermore, as discussed above, the display of any data (and/or the selection of which physical activity data and image data are displayed) may vary depending on one or more variables, including, for example, whether a threshold has been met during the game, the type of event, a selection or input by user 124, an input by a viewer, an indication of user 124's performance (e.g., that a performance zone has been reached), and/or combinations thereof. Based on one or more computer-readable instructions on a non-transitory computer-readable medium, other embodiments may determine which activity value(s) may be displayed to viewer(s) and the duration for which certain values are displayed during a particular time period.
In some implementations, the image data may not be correlated with at least a portion of the activity data until a later time. Image data transmission and/or correlation with activity data may be performed on a regular basis, such as every 1 second, 10 seconds, 30 seconds, 1 minute, or any other time increment. In this regard, the system and/or the user may decide to evaluate one or more metrics at a later time. These metrics may be based on, for example, the type of athletic activity performed during the session (e.g., basketball game, soccer game, running session, etc.). Certain embodiments may permit the evaluation and/or analysis of metrics different from those initially viewed and/or desired when the image(s) were acquired. For example, user 124 and/or a coach may initially be interested in evaluating whether the user's vertical jump meets a first threshold (e.g., about 4 inches), while at a later time the coach or user 124 may want to evaluate the image(s) with an overlay of steps per unit time (e.g., steps per minute). In certain embodiments, the computer 102 may prompt the user to indicate which metrics to monitor for each type of session (e.g., baseball, football, basketball, etc.) and store the identified metrics in a user profile. In yet another embodiment, the type of session may be derived from the collected data, including, but not limited to, the activity data or the image data.
The computer 102 may also prompt the user at the beginning of each session for the desired metrics, i.e., which data to collect (including, but not limited to, data that may not be overlaid on the image). Other embodiments may adjust the image data selected and/or utilized; for example, variables may include resolution, frame rate, storage format protocol, and combinations thereof. At the beginning of a session, sensors, such as sensors within a shoe (see device sensor 140) and/or other sensors, may be calibrated. In still other embodiments, sensors may be calibrated during, or after, a session or event. In certain embodiments, previously collected data may be utilized in determining whether to calibrate and/or which parameters to calibrate.
Block 3710 and/or other aspects of certain embodiments may involve generating and/or displaying summary segments with the image data. For example, the image data may be utilized to form a 25-second video. In certain embodiments, the video file may be formed to include a segment (e.g., 5 seconds), such as at the end of the 25 seconds of image data, that provides a summary of certain statistics. In embodiments where the video is a single file, that segment may also form part of the same file. In certain embodiments, a summary screen (or another summary) may be presented to the user while the video file is being created (e.g., during the time in which the image data is being aligned with the sensor data). Other information may be displayed with the image data. For example, in one embodiment, an overlay may display the origin of the data, such as whether it was captured by a wrist-worn or shoe-mounted sensor, and/or the specific make or model of the sensor.
Other aspects relate to creating and/or displaying a "representative image" formed from the images within a collection of images (e.g., see block 3712). The representative image may be utilized as a "thumbnail" image or a cover image. In other embodiments, the representative image may be used to represent a particular video among a plurality of videos, each of which may have its own representative image. In one embodiment, the representative image may be selected based on its association in time with the data value representing the highest value of at least one athletic parameter. For example, the highest value of a jump (e.g., its vertical height) may be utilized to select the image. In still other embodiments, the highest value relating to velocity, acceleration, and/or another parameter may be utilized in selecting the image. Those skilled in the art will appreciate that the "best" data value may not be the highest; thus, this disclosure is not limited to image data associated with the "highest" value but includes any data.
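Selecting a representative image by the best parameter value can be done by finding the peak metric sample and the frame nearest to it in time. A sketch under the assumption that frames and metric samples carry comparable timestamps; field names are illustrative.

    def representative_frame(frames, metric_samples, key="value"):
        """Pick the frame closest in time to the best (here: highest)
        value of a tracked parameter, e.g., vertical jump height."""
        best = max(metric_samples, key=lambda s: s[key])
        return min(frames, key=lambda f: abs(f["t"] - best["t"]))

    frames = [{"t": i / 20.0, "frame": i} for i in range(100)]  # 20 fps
    jumps = [{"t": 1.0, "value": 18.0}, {"t": 3.2, "value": 27.5}]
    print(representative_frame(frames, jumps))  # {'t': 3.2, 'frame': 64}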
In other embodiments, the user (or any individual) may select which parameter or parameters are desired. In still other embodiments, computer-executable instructions on a tangible computer-readable medium may select the parameters based on the collected data. In still other embodiments, multiple images may be selected based on the correlated physical activity data, and the user may be allowed to select one. Any physical activity data and/or image data may be associated with location data, such as GPS coordinates or a particular court.
Other embodiments involve establishing a set of image data from multiple users based on sensed data (e.g., see block 3714). In one embodiment, a "highlight reel" may be formed that includes image data of a plurality of users. In one example, the highlight reel may be established from data obtained at a sporting event. For example, multiple players on one or more teams may be recorded, such as during a televised sporting event. Based on the sensed athletic data, images (e.g., video) obtained during the corresponding athletic performances may be aggregated to create a highlight reel of the sporting event or a portion thereof (e.g., the first and/or last two minutes). For example, sensors may obtain athletic data from the players during the sporting event, and based on at least one criterion (e.g., a jump greater than 24 inches and/or a pace greater than 3 steps per second), the correlated image data may be utilized in forming the highlight reel.
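The criterion-based assembly of a highlight reel reduces to filtering candidate clips by their sensed metrics and ordering the survivors. The sketch below uses the example thresholds named in the text (a jump greater than 24 inches, a pace greater than 3 steps per second); the clip fields are assumptions.

    def highlight_clips(clips, min_jump_in=24.0, min_pace_sps=3.0):
        """Keep clips whose sensed athletic data meets at least one
        criterion, then order them for the reel by capture time."""
        keepers = [c for c in clips
                   if c["jump_in"] > min_jump_in or c["pace_sps"] > min_pace_sps]
        return sorted(keepers, key=lambda c: c["t"])

    clips = [
        {"t": 12.0, "player": "A", "jump_in": 26.0, "pace_sps": 2.1},
        {"t": 47.5, "player": "B", "jump_in": 19.0, "pace_sps": 3.4},
        {"t": 63.0, "player": "A", "jump_in": 20.0, "pace_sps": 2.0},
    ]
    print([c["player"] for c in highlight_clips(clips)])  # ['A', 'B']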
Certain embodiments relate to generating a feed, or multiple feeds, of image sets based on at least one criterion. For example, viewers of a sporting event often do not have time to watch every game or contest, such as during the post-season of a sporting event. Thus, in one embodiment, a feed may be selectively limited to the physical activity of certain team(s), of friends, or of followed players, such as basketball games in which particular parameter value(s) were achieved. Thus, in some embodiments of the invention, the image data may include image data acquired during a first time period and image data acquired during a second time period different from the first time period. The feed may also be categorized based on the type of activity and/or the sensors utilized to acquire the activity data. In certain embodiments, the highlight reels and/or feeds may be based, at least in part, on whether the player(s) were within a performance zone.
In one embodiment, the image data acquired during the first time period is from a first geographic location and the image data acquired during the second time period is from a second geographic location. In some embodiments, images obtained at two or more locations during two different time periods may be combined into a single image. In one embodiment, the user's physical performance may be captured via a mobile phone or other device and merged with image data corresponding to a historical athletic performance or a known venue. For example, a video of the user shooting a basketball may be merged with a video of a well-known player making a three-point shot in the final minute of a game. In some embodiments, the user may capture an image of the scene prior to recording a video of the user performing an athletic movement at the same location. The mobile phone, or other device, may then remove the scene data from the video to isolate the user. The isolated video of the user may then be merged with, or overlaid on, an image or video of another location or event. Similarly, selected portions of the acquired image data may be replaced. For example, a video of the user dunking a basketball may be edited to replace the basketball with a tennis ball. Various other features and devices may be used in accordance with the aspects described herein. Additional or alternative features may also be incorporated into the devices and/or applications associated therewith.
Aspects of the embodiments have been described in terms of illustrative embodiments thereof. Numerous other embodiments, modifications, and variations within the scope and spirit of the appended claims will occur to persons of ordinary skill in the art from a review of this disclosure. For example, those skilled in the art will appreciate that the steps illustrated in the figures may be performed in an order other than the recited order, and that one or more of the illustrated steps may be optional in accordance with aspects of the embodiments.

Claims (30)

1. A method, comprising:
receiving a first user input from a first user specifying a challenge for one or more users, wherein the challenge is indicative of at least a first skateboard movement to be performed by one or more other users and a plurality of associated performance characteristics;
receiving a first performance data set generated by one or more sensors that measure athletic activity performed by a second user;
determining, by the one or more processors and based on the first performance data set, a first activity metric set corresponding to the athletic activity performed by the second user and the plurality of associated performance characteristics;
determining that at least a portion of the first performance data set indicates that the second user is within a performance zone for the first skateboard movement during an exercise session;
selecting image data associated with the performance of the athletic activity to be displayed based on the determination that the second user is within the performance zone for the first skateboard movement;
determining, by the one or more processors and based on the first performance data set indicating that the first activity metric set satisfies a plurality of thresholds corresponding to at least one of the plurality of associated performance characteristics specified by the first user input, that the second user successfully performed the first skateboard movement; and
outputting, to a display device, an indication that the second user successfully performed the first skateboard movement.
2. The method of claim 1, wherein the first user input from the first user specifies a combination of two or more skateboard movements, the method further comprising:
receiving a second user input indicating a first sequence for performing a combination of two or more skateboard movements; and
determining, by the one or more processors, a first time period for the combination of two or more skateboard movements.
3. The method of claim 1, further comprising:
determining, by the one or more processors and based on the first performance data set, a recommended athletic activity to assist the second user in achieving the performance zone for the first skateboard movement.
4. The method of claim 1, further comprising:
receiving input indicative of one or more portions of the first performance data set associated with achieving the performance zone for the first skateboard movement; and
adjusting, based on the input, a minimum activity metric used to determine whether the performance zone has been achieved.
5. The method of claim 1, further comprising:
determining a location of a second user; and
determining, based on the location of the second user, the first skateboard movement associated with the challenge.
6. The method of claim 1, further comprising:
determining weight distribution data for the second user based on the first performance data set; and
associating the weight distribution of the second user with a result of the athletic activity performed by the second user.
7. The method of claim 1, further comprising:
receiving a second data set generated by one or more sensors associated with a second user; and
determining whether the second data set corresponds to the first skateboard movement.
8. The method of claim 1, wherein the first user input specifies a plurality of skateboard movements, the method further comprising:
displaying, via the user interface, a representation of the first user's progress toward successfully performing the plurality of skateboard movements.
9. The method of claim 1, further comprising:
adjusting the challenge to include a second skateboard movement to be performed by the first user.
10. The method of claim 9, wherein the challenge is adjusted to include a second skateboard movement based on the athletic performance of the first user.
11. The method of claim 1, further comprising:
determining whether the first user has sufficient athletic experience to perform the first skateboard movement.
12. The method of claim 11, wherein determining whether the first user has sufficient athletic experience further comprises:
determining whether the first user has previously performed one or more of a plurality of branch movements of the first skateboard movement.
13. The method of claim 1, wherein the first skateboard movement comprises a plurality of branch movements, the method further comprising:
in response to determining that the second user has previously performed at least one of the plurality of branch movements, determining, by the one or more processors, that the second user has sufficient athletic experience to perform the first skateboard movement.
14. The method of claim 1, further comprising:
determining a second skateboard movement and an associated second set of performance characteristics based on a location of a user of the one or more other users; and
in response to determining the second skateboard movement, communicating the second skateboard movement and the associated second set of performance characteristics to the user.
15. The method of claim 1, wherein a first performance characteristic of the plurality of associated performance characteristics indicated by the first user input comprises a rotational force exerted on a skateboard.
16. The method of claim 1, further comprising:
determining, by the one or more processors, that the first user satisfies a first threshold associated with a first performance characteristic of the plurality of associated performance characteristics indicated by the first user input; and
determining, by the one or more processors and based on the first performance data set indicating that the one or more activity metrics satisfy the first threshold, that the second user successfully performed the first skateboard movement.
17. The method of claim 1, wherein a first performance characteristic of the plurality of associated performance characteristics comprises a parameter indicative of one of a speed, an acceleration, a position, a rotational force, or an elevation of the skateboard.
18. The method of claim 6, further comprising:
generating, based at least in part on the weight distribution data, an activity recommendation for the second user to successfully perform the athletic activity.
19. The method of claim 6, further comprising:
determining how the weight distribution of the second user affects the performance of at least a second skateboard movement by the second user.
20. An apparatus, comprising:
one or more processors; and
a memory storing computer-executable instructions that, when executed by the one or more processors, cause the apparatus to at least perform:
receiving user input from a first user specifying a challenge for one or more users, wherein the challenge is indicative of at least a first skateboard movement to be performed by one or more other users and a plurality of associated performance characteristics;
receiving a first performance data set generated by one or more sensors that measure athletic activity performed by a second user;
determining, by the one or more processors and based on the first performance data set, a first activity metric set corresponding to the athletic activity performed by the second user and the plurality of associated performance characteristics;
determining that at least a portion of the first performance data set indicates that the second user is within a performance zone for the first skateboard movement during an exercise session;
selecting image data associated with the performance of the athletic activity to be displayed based on determining that the second user is within the performance zone for the first skateboard movement;
determining, by the one or more processors and based on the first performance data set indicating that the first activity metric set satisfies a plurality of thresholds corresponding to at least one of the plurality of associated performance characteristics specified by the first user input, that the second user successfully performed the first skateboard movement; and
outputting, to a display device, an indication that the second user successfully performed the first skateboard movement.
21. The apparatus of claim 20, wherein the first user input from the first user specifies a combination of two or more skateboard movements, and the computer-executable instructions, when executed by the one or more processors, cause the apparatus to:
receiving a second user input indicating a first sequence for performing a combination of two or more skateboard movements; and
determining a first time period for the combination of two or more skateboard movements.
22. The apparatus of claim 20, wherein the computer-executable instructions, when executed by the one or more processors, cause the apparatus to:
determining, based on the first performance data set, a recommended athletic activity to assist the second user in achieving the performance zone for the first skateboard movement.
23. The apparatus of claim 20, wherein the computer-executable instructions, when executed by the one or more processors, cause the apparatus to:
receiving input indicative of one or more portions of the first performance data set associated with achieving the performance zone for the first skateboard movement; and
adjusting, based on the input, a minimum activity metric used to determine whether the performance zone has been achieved.
24. The apparatus of claim 20, wherein the computer-executable instructions, when executed by the one or more processors, cause the apparatus to:
determining a location of a second user; and
determining, based on the location of the second user, the first skateboard movement associated with the challenge.
25. The apparatus of claim 20, wherein the computer-executable instructions, when executed by the one or more processors, cause the apparatus to:
receiving a second performance data set generated by one or more sensors associated with a third user;
determining a second set of activity metrics corresponding to the athletic activity performed by the third user and the plurality of associated performance characteristics based on the second set of performance data;
determining whether the second performance data set corresponds to the first skateboard movement;
determining, based on a second performance data set indicating that the second activity metric set satisfies a plurality of thresholds corresponding to at least one of the plurality of associated performance characteristics specified by the first user input, that the second performance data set indicates that the third user successfully performed the first skateboard movement; and
outputting, to the display device, an indication that the third user successfully performed the first skateboard movement.
26. The apparatus of claim 25, wherein the computer-executable instructions, when executed by the one or more processors, cause the apparatus to:
adjusting the challenge to include a second skateboard movement to be performed by the first user and the second user.
27. The apparatus of claim 20, wherein the first user input from the first user specifies a combination of two or more skateboard movements, and wherein the computer-executable instructions, when executed by the one or more processors, cause the apparatus to:
receiving a second user input indicating a first sequence for performing the combination of two or more skateboard movements; and
determining a first time period for the combination of two or more skateboard movements.
28. The apparatus of claim 20, wherein the computer-executable instructions, when executed by the one or more processors, cause the apparatus to:
determining whether the first user has sufficient athletic experience to perform the first skateboard movement.
29. The apparatus of claim 28, wherein the first skateboard movement comprises a plurality of branch movements.
30. The apparatus of claim 29, wherein the computer-executable instructions, when executed by the one or more processors, cause the apparatus to:
determining whether the first user has previously performed one or more of the plurality of branch movements of the first skateboard movement.
CN202210766948.2A 2013-05-31 2014-05-30 Skateboard system Pending CN115545959A (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US201361829809P 2013-05-31 2013-05-31
US61/829,809 2013-05-31
CN201480043593.8A CN105453117A (en) 2013-05-31 2014-05-30 Skateboard system
PCT/US2014/040351 WO2014194266A1 (en) 2013-05-31 2014-05-30 Skateboard system

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
CN201480043593.8A Division CN105453117A (en) 2013-05-31 2014-05-30 Skateboard system

Publications (1)

Publication Number Publication Date
CN115545959A true CN115545959A (en) 2022-12-30

Family

ID=51134291

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201480043593.8A Pending CN105453117A (en) 2013-05-31 2014-05-30 Skateboard system
CN202210766948.2A Pending CN115545959A (en) 2013-05-31 2014-05-30 Skateboard system

Family Applications Before (1)

Application Number Title Priority Date Filing Date
CN201480043593.8A Pending CN105453117A (en) 2013-05-31 2014-05-30 Skateboard system

Country Status (3)

Country Link
EP (1) EP3005193A1 (en)
CN (2) CN105453117A (en)
WO (1) WO2014194266A1 (en)

Families Citing this family (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9984549B2 (en) 2015-12-14 2018-05-29 Intel Corporation Networked sensor systems and methods
TWI618410B (en) * 2016-11-28 2018-03-11 Bion Inc Video message live sports system
EP3332697B1 (en) * 2016-12-12 2019-08-14 The Swatch Group Research and Development Ltd Method for detecting and calculating the duration of a jump

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3360647B2 (en) 1998-09-16 2002-12-24 トヨタ自動車株式会社 Car body front structure
US8206219B2 (en) 2002-10-30 2012-06-26 Nike, Inc. Interactive gaming apparel for interactive gaming
EP2330937B1 (en) 2008-06-13 2016-11-16 NIKE Innovate C.V. Footwear having sensor system
US20100076347A1 (en) * 2008-09-25 2010-03-25 Mcgrath Michael J Dynamic movement analysis system
RU2011125523A (en) * 2008-12-22 2013-01-27 Кристал Волд Холдингс, Инк. INDEX DEFINITION SYSTEM, FINANCIAL INSTRUMENT EXCHANGE SYSTEM AND METHOD OF RISK MANAGEMENT OF ENTERPRISES RELATED TO SPORTS
US20120116714A1 (en) * 2010-08-03 2012-05-10 Intellisysgroup Llc Digital Data Processing Systems and Methods for Skateboarding and Other Social Sporting Activities
WO2012018914A2 (en) * 2010-08-03 2012-02-09 Intellisys Group, Llc Digital data processing systems and methods for skateboarding and other social sporting activities
KR101549761B1 (en) * 2010-11-05 2015-09-02 나이키 이노베이트 씨.브이. Method and system for automated personal training
US8593286B2 (en) * 2010-12-01 2013-11-26 At&T Intellectual Property I, L.P. System and method for wireless monitoring of sports activities
US20130024248A1 (en) * 2011-02-17 2013-01-24 Nike, Inc. Retail Training Application
US20120296235A1 (en) * 2011-03-29 2012-11-22 Rupp Keith W Automated system and method for performing and monitoring physical therapy exercises
US20120259652A1 (en) * 2011-04-07 2012-10-11 Full Recovery, Inc. Systems and methods for remote monitoring, management and optimization of physical therapy treatment
US9504909B2 (en) * 2011-05-05 2016-11-29 Qualcomm Incorporated Method and apparatus of proximity and stunt recording for outdoor gaming

Also Published As

Publication number Publication date
EP3005193A1 (en) 2016-04-13
CN105453117A (en) 2016-03-30
WO2014194266A1 (en) 2014-12-04

Similar Documents

Publication Publication Date Title
US11594145B2 (en) Skateboard system
US20220028521A1 (en) Selecting And Correlating Physical Activity Data With Image Data
JP6827018B2 (en) Conducting a session using captured image data of physical activity and uploading using a token-verifiable proxy uploader
US20200298091A1 (en) Athletic Data Aggregation for Online Communities
US9381420B2 (en) Workout user experience
US20130002533A1 (en) User experience
US20130245966A1 (en) User experience
US20230186780A1 (en) Skateboard System
CN115545959A (en) Skateboard system

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination