US10996235B2 - System and method for cycle duration measurement in repeated activity sequences - Google Patents

System and method for cycle duration measurement in repeated activity sequences

Info

Publication number
US10996235B2
Authority
US
United States
Prior art keywords
cycle
frame buffer
cycles
frames
segmentation
Prior art date
Legal status
Active, expires
Application number
US16/236,753
Other versions
US20200209276A1 (en)
Inventor
Lincan Zou
Liu Ren
Cheng Zhang
Current Assignee
Robert Bosch GmbH
Original Assignee
Robert Bosch GmbH
Priority date
Filing date
Publication date
Application filed by Robert Bosch GmbH
Priority to US16/236,753 (granted as US10996235B2)
Assigned to ROBERT BOSCH GMBH. Assignors: REN, LIU; ZHANG, CHENG; ZOU, LINCAN
Priority to DE102019219479.6A
Priority to CN201911395387.4A (granted as CN111385813B)
Publication of US20200209276A1
Application granted
Publication of US10996235B2
Legal status: Active; adjusted expiration

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G5/00 - Control arrangements or circuits for visual indicators common to cathode-ray tube indicators and other visual indicators
    • G09G5/36 - Control arrangements or circuits characterised by the display of a graphic pattern, e.g. using an all-points-addressable [APA] memory
    • G09G5/39 - Control of the bit-mapped memory
    • G09G5/393 - Arrangements for updating the contents of the bit-mapped memory
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01P - MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P13/00 - Indicating or recording presence, absence, or direction, of movement
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W24/00 - Supervisory, monitoring or testing arrangements
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F3/147 - Digital output to display device using display panels
    • H - ELECTRICITY
    • H04 - ELECTRIC COMMUNICATION TECHNIQUE
    • H04W - WIRELESS COMMUNICATION NETWORKS
    • H04W28/00 - Network traffic management; Network resource management
    • H04W28/02 - Traffic management, e.g. flow control or congestion control

Definitions

  • the present disclosure relates to systems and methods for measurement of cycle duration in repeated activity sequences.
  • aspects of human movement can have a large impact on, for example, the way a tool is designed, the way a workspace is laid out, or the way a task is performed. Understanding how the human body can move and interact with objects and the environment can result in tools that are more ergonomic, workspaces that are more efficient to navigate, and tasks that are more intuitive to perform.
  • the range of possible human motions and gestures is vast, however, and simple tasks, such as lifting a cup, pointing in a direction, or turning a screw, often result from a complex set of biomechanical interactions. This emergence of simple results from complex movements can make human motions and gestures extremely difficult to quantify or understand in a meaningful or practical way.
  • a system for measurement of cycle duration in repeated activity sequences includes a display device, a memory configured to store a motion analysis application and motion capture data including a reference activity sequence and a frame buffer including frames corresponding to one or more cycles of query activity sequences, and a processor, operatively connected to the memory and the display device.
  • the processor is configured to execute the motion analysis application to detect a cycle within the frame buffer using a global optimization, including to create a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles until the frame buffer lacks sufficient frames to create additional cycles, compute segmentation errors for each of the plurality of cycle segmentations, and identify the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error, generate cycle duration data for the detected cycle, remove frames belonging to the detected cycle from the frame buffer, and output the cycle duration data to the display device.
  • a method for measurement of cycle duration in repeated activity sequences includes detecting, using a global optimization, a cycle within a frame buffer including frames corresponding to one or more cycles of query activity sequences, the detecting including creating a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles corresponding to cycles of a reference activity sequence until the frame buffer lacks sufficient frames to create additional cycles, computing segmentation errors for each of the plurality of cycle segmentations, and identifying the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error, generating cycle duration data for the detected cycle; removing frames belonging to the detected cycle from the frame buffer; and outputting the cycle duration data to the display device.
  • non-transitory computer readable medium comprising instructions of a motion analysis application that, when executed by one or more processors, cause the one or more processors to detect, using a global optimization, a cycle within a frame buffer including frames corresponding to one or more cycles of query activity sequences, the detecting including to create a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles corresponding to cycles of a reference activity sequence until the frame buffer lacks sufficient frames to create additional cycles, compute segmentation errors for each of the plurality of cycle segmentations, and identify the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error, generate cycle duration data for the detected cycle; remove frames belonging to the detected cycle from the frame buffer; continue to detect additional cycles within the frame buffer using the global optimization until the frame buffer lacks sufficient cycles for an additional cycle; and output the cycle duration data.
  • FIG. 1 illustrates an example system for measurement of cycle duration in repeated activity sequences
  • FIG. 2 illustrates an example of a cycle of a reference activity sequence and a frame buffer
  • FIG. 3 illustrates an example process for measurement of cycle duration in repeated activity sequences
  • FIG. 4 illustrates an example diagram of a cycle segmentation search within the frame buffer
  • FIG. 5 illustrates an example diagram of a calculation of segmentation error
  • FIG. 6 illustrates an example diagram of a determination of cycles from the calculated segmentations
  • FIG. 7 illustrates an example of preparation of the frame buffer for detection of the next cycle.
  • This disclosure proposes a solution to measure cycle duration in repeated physical human activities using the time-series data from IMUs, with improved performance.
  • the proposed approach measures cycle duration in repeated activity sequences streamed into or otherwise stored in a buffer.
  • the proposed approach utilizes global optimization of all activities stored in the buffer with the aim to minimize the total error of all cycle durations.
  • as a result, the cycle duration measurement accuracy is improved dramatically.
  • the accuracy and the response delay are both positively correlated with the buffer size. For highly standardized repeated activities, a relatively small buffer size can still meet the accuracy expectation while also reducing the response delay. Further aspects of the disclosure are discussed in detail below.
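The overall measurement loop that the disclosure describes (detect the best-matching cycle at the head of the buffer, record its elapsed time, drop its frames, and repeat) can be sketched in Python. All names here (Frame, measure_cycle_durations, the detect_cycle callback) are illustrative assumptions, not identifiers from the patent:

```python
from dataclasses import dataclass

@dataclass
class Frame:
    timestamp: float  # seconds since start of capture
    data: tuple       # sensor samples for this interval

def measure_cycle_durations(buffer, detect_cycle):
    """Repeatedly detect the best-matching cycle at the head of the
    buffer, record its elapsed time, and drop its frames, until the
    buffer lacks sufficient frames to form another cycle."""
    durations = []
    while True:
        cycle = detect_cycle(buffer)  # list of frames, or None if too few remain
        if cycle is None:
            break
        durations.append(cycle[-1].timestamp - cycle[0].timestamp)
        del buffer[:len(cycle)]       # detected cycle starts at the buffer head
    return durations
```

The detect_cycle callback stands in for the global-optimization search described in the remainder of the disclosure.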
  • FIG. 1 is a schematic diagram of an exemplary embodiment of a system 100 for measurement of cycle duration in repeated activity sequences.
  • the system 100 may quantitatively compute an accuracy, e.g., a deviation from a prescribed motion or gesture and deviation from a target time period for completion, of a movement.
  • the system 100 includes a processor 102 that is operatively connected to a memory 110 , input device 116 , motion capture device 118 , and a display device 108 .
  • the system 100 (i) detects, using a global optimization, a cycle within a frame buffer including frames corresponding to one or more cycles of query activity sequences, the detecting including (a) creating a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles corresponding to cycles of the reference activity sequence until the frame buffer lacks sufficient frames to create additional cycles, (b) computing segmentation errors for each of the plurality of cycle segmentations, and (c) identifying the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error; (ii) generates cycle duration data for the detected cycle; (iii) removes frames belonging to the detected cycle from the frame buffer; and (iv) outputs the cycle duration data.
  • the processor 102 includes one or more integrated circuits that implement the functionality of a central processing unit (CPU) 104 and graphics processing unit (GPU) 106 .
  • the processor 102 is a system on a chip (SoC) that integrates the functionality of the CPU 104 and GPU 106 , and optionally other components including, for example, the memory 110 , a network device, and a positioning system, into a single integrated device.
  • the CPU 104 and GPU 106 are connected to each other via a peripheral connection device such as PCI express or another suitable peripheral data connection.
  • the CPU 104 is a commercially available central processing device that implements an instruction set such as one of the x86, ARM, Power, or MIPS instruction set families.
  • the GPU 106 may include hardware and software for display of at least two-dimensional (2D) and optionally three-dimensional (3D) graphics to a display device 108 .
  • the display device 108 may include an electronic display screen, projector, printer, or any other suitable device that reproduces a graphical display.
  • processor 102 executes software programs including drivers and other software instructions using the hardware functionality in the GPU 106 to accelerate generation and display of the graphical depictions of models of human movement and visualizations of quantitative computations that are described herein
  • the CPU 104 and GPU 106 execute stored program instructions that are retrieved from the memory 110 .
  • the stored program instructions include software that control the operation of the CPU 104 and the GPU 106 to perform the operations described herein.
  • while FIG. 1 depicts the processor 102 as including both the CPU 104 and GPU 106 , alternative embodiments may omit the GPU 106 . For example, the processor 102 may be of a server that generates output visualization data using only a CPU 104 and transmits the output visualization data to a remote client computing device that uses a GPU 106 and a display device 108 to display the data.
  • alternative embodiments of the processor 102 can include microcontrollers, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), digital signal processors (DSPs), or any other suitable digital logic devices in addition to or as replacements of the CPU 104 and GPU 106 .
  • the memory 110 includes both non-volatile memory and volatile memory devices.
  • the non-volatile memory includes solid-state memories, such as NAND flash memory, magnetic and optical storage media, or any other suitable data storage device that retains data when the system 100 is deactivated or loses electrical power.
  • the volatile memory includes static and dynamic random-access memory (RAM) that stores program instructions and data, including a motion analysis application 112 and motion capture data 114 , during operation of the system 100 .
  • the CPU 104 and the GPU 106 each have access to separate RAM devices (e.g., a variant of DDR SDRAM for the CPU 104 and a variant of GDDR, HBM, or other RAM for the GPU 106 ) while in other embodiments the CPU 104 and GPU 106 access a shared memory device.
  • the memory 110 may store the motion capture data 114 and motion analysis application 112 for maintenance and retrieval.
  • the input device 116 may include any of various devices that enable the system 100 to receive the motion capture data 114 and motion analysis application 112 .
  • suitable input devices include human interface inputs such as keyboards, mice, touchscreens, voice input devices, and the like.
  • the system 100 implements the input device 116 as a network adapter or peripheral interconnection device that receives data from another computer or external data storage device, which can be useful for receiving large sets of motion capture data 114 in an efficient manner.
  • the motion analysis application 112 includes instructions that, when executed by the processor 102 of the system 100 , cause the system 100 to perform the processes and operations described herein. These processes and operations include (i) detecting, using a global optimization, a cycle within a frame buffer 124 including frames corresponding to one or more cycles of query activity sequences, the detecting including (a) creating a plurality of cycle segmentations by recursively iterating through the frame buffer 124 to identify candidate cycles corresponding to cycles of the reference activity sequence until the frame buffer 124 lacks sufficient frames to create additional cycles, (b) computing segmentation errors for each of the plurality of cycle segmentations, and (c) identifying the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error; (ii) generating cycle duration data 126 for the detected cycle; (iii) removing frames belonging to the detected cycle from the frame buffer; and (iv) outputting the cycle duration data 126 .
  • the motion capture data 114 refers to a plurality of records representative of the locations of at least one tracked item or portion of the item over time.
  • the motion capture data 114 may include one or more of: records of positions of a reference point on a body part over time or at set time intervals, sensor data taken over time, a video stream or a video stream that has been processed using a computer-vision technique, data indicative of the operating state of a machine over time, etc.
  • the motion capture data 114 may include data representative of more than one continuous movement.
  • the motion capture data 114 may include a combination of a plurality of combined motion capture data 114 sets.
  • a motion capture device 118 is a device configured to generate motion capture data 114 .
  • Motion capture devices 118 may include, as some non-limiting examples: cameras, visual sensors, infra-red sensors, ultrasonic sensors, accelerometers, pressure sensors, or the like.
  • One non-limiting example of a motion capture device 118 is one or a pair of digital gloves that a user wears while performing cyclical motions.
  • the digital gloves may include sensors that capture the motions of the user to generate the motion capture data 114 that may be stored in the memory 110 or elsewhere to a data storage device.
  • a movement is an action performed by an operator.
  • a reference movement refers to a baseline or canonical version of the movement.
  • the reference movement may be used as a standard of comparison for other movements, to allow for identification of how close the other movements are to the reference movement.
  • the motion capture data 114 may be generally classified into one of two categories for the purpose of computing the accuracy of a human motion: a reference activity sequence 120 that includes data representative of the reference or baseline movement, and a query activity sequence 122 that includes data representative of the test movement, i.e., a movement to be compared and quantitatively evaluated for accuracy relative to the baseline movement.
  • the reference activity sequence 120 may include motion capture data 114 received from the motion capture device 118 .
  • the data may also include processed movement data, such as, for example, frame, step, cycle, and time information gleaned from the raw movement data.
  • the reference movement is represented as a reference activity sequence 120 having an ordered set of frames that each includes motion capture data 114 corresponding to a respective interval of time of the reference movement.
  • the query activity sequence 122 may also include motion capture data 114 received from the motion capture device 118 .
  • a movement or movements in the motion capture data 114 may include a label or labels classifying the movements as either reference movements or test movements.
  • the motion analysis application 112 may be programmed to receive instructions for classifying a movement or movements as reference movements or test movements, such as from a user via the input device 116 or from another source.
  • the motion analysis application 112 may also be programmed to separate a received movement into frames, whereby a “frame” corresponds to a discrete interval of time.
  • each frame of a movement includes a portion of the motion capture data 114 corresponding to a portion of the movement occurring during a respective interval of the timeline for that movement.
  • the duration for the interval corresponding to an individual frame is preset.
  • the duration for the interval corresponding to an individual frame is set based on an instruction received from, for example, the user via the input device 116 or another source.
  • the duration for the interval corresponding to an individual frame is set with reference to one or more characteristics of the motion capture data 114 .
  • the duration for the interval corresponding to an individual frame is set with reference to one or more of a duration of a reference movement, a total travel distance for the movement, a number of individual motions or gestures within the movement, a speed of the movement, etc.
  • the same interval for the duration of frames is used for both a reference movement and for test movements to be evaluated relative to the reference movement.
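The separation of a captured movement into fixed-duration frames, as described above, might look like the following sketch. The function name split_into_frames and the (timestamp, value) sample format are assumptions made for illustration:

```python
def split_into_frames(samples, interval):
    """Group (timestamp, value) samples into frames of fixed duration.
    `interval` is the frame duration in seconds; the same interval is
    used for both reference and test movements."""
    frames = []
    current, frame_end = [], None
    for ts, value in samples:
        if frame_end is None:
            frame_end = ts + interval
        while ts >= frame_end:  # start a new frame once the interval elapses
            frames.append(current)
            current, frame_end = [], frame_end + interval
        current.append((ts, value))
    if current:
        frames.append(current)
    return frames
```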
  • a frame buffer 124 is a data storage configured to hold frames of one or more query activity sequences 122 .
  • FIG. 2 illustrates an example of a cycle of a reference activity sequence 120 and a frame buffer 124 .
  • the reference activity sequence 120 includes four frames, while the frame buffer 124 includes repeated query activity sequences 122 with ten frames.
  • the frame buffer 124 may contain all sequence frames if the system 100 is used offline to process saved query activity sequence 122 data.
  • the frame buffer 124 may also be updated dynamically at runtime if the system 100 is used in an online mode to process streaming data.
  • a frame includes motion capture data 114 captured from wearable sensors and a timestamp of the captured data. If the wearable sensors are IMU sensors of a motion capture device 118 (such as a glove or smart clothing), the captured data may be received from accelerometer, gyroscope, and/or magnetometer sensors.
  • motion capture data 114 may be received as a file of stored motion capture data from a data storage device.
  • the motion analysis application 112 may separate the movement or movements in the motion capture data 114 into frames to be added to the frame buffer 124 for further processing.
  • the cycle duration data 126 may include information indicative of the durations of the cycles in the frame buffer 124 . This information may include, in an example, the elapsed time between the first frame and the last frame of the cycle within the frame buffer 124 . In an example, the cycle duration data 126 may be displayed to the display device 108 . As discussed in greater detail below, the system 100 may be configured to perform measurement of cycle duration of motion capture data 114 in the frame buffer 124 descriptive of repeated query activity sequences 122 .
  • the motion analysis application 112 may also map frames of the test movement to corresponding frames of the reference movement.
  • the cycle duration data 126 may be useful in identifying the boundaries of each of the cycles of query activity sequences 122 for comparison with the reference activity sequence 120 .
  • the test movement and reference movement are synchronized so that frames of the test movement are mapped to frames of the reference movement that correspond temporally, and in some embodiments, the test movement and the reference movement are aligned in terms of gestures and motions within the movement, such that frames of the test movement are mapped to frames of the reference movement that correspond with regard to the sequence of motions and/or gestures performed in the movement.
  • the motion analysis application 112 may further compute an accuracy of the test movement or test movements relative to the reference movement.
  • FIG. 3 illustrates an example process 300 for measurement of cycle duration in repeated activity sequences. For each iteration, one cycle is detected from the frame buffer 124 , the cycle information is generated, and the frames belonging to the detected cycle are removed from the frame buffer 124 . The iteration continues until the frame buffer 124 lacks sufficient frames to form an additional cycle.
  • the process 300 may be performed by the processor 102 of the system 100 , as discussed in more detail below.
  • the processor 102 receives the reference activity sequence 120 and the frame buffer 124 containing repeated query activity sequences 122 .
  • the reference activity sequence 120 may be retrieved from the memory 110 .
  • the frame buffer 124 may include one or both of previously-recorded data from the motion capture device 118 and dynamic data streamed in real time into the frame buffer 124 from the motion capture device 118 .
  • FIG. 4 illustrates an example diagram 400 of a cycle segmentation search within the frame buffer 124 .
  • Each iteration of detection of a cycle finds a cycle that matches the cycle of the reference activity sequence 120 from the remaining data in the frame buffer 124 .
  • finding a cycle in the frame buffer 124 is equivalent to finding a cycle end frame in the remaining data in the frame buffer 124 .
  • One method of finding the cycle end frame is first to calculate an error value for each frame.
  • f_i denotes the frame at index i in the buffer;
  • c_s^i denotes the cycle consisting of frames f_s through f_i in the buffer;
  • c_r denotes the reference cycle of the reference activity sequence 120 ;
  • f_s is the start frame in the frame buffer 124 from which to search for the current cycle. Initially, when beginning the process 300, f_s refers to the first frame in the frame buffer 124 . f_s is updated for each iteration as illustrated in the example diagram 400 .
  • dynamic time warping (DTW) is a technique for comparing sequences that may vary in speed by computing an optimal matching between discrete portions of the sequences.
  • DTW includes computing a “warping path” that corresponds to a mapping between a portion of one sequence and a similar portion of another sequence occurring at a different time or rate. It should be noted that use of DTW is merely one example, and other approaches for determining a distance between two cycles may also be used.
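A minimal DTW distance in the classic dynamic-programming form could look like the following sketch; the function name and the default per-sample distance are illustrative choices, not the patent's implementation:

```python
def dtw_distance(a, b, dist=lambda x, y: abs(x - y)):
    """Classic O(len(a) * len(b)) dynamic time warping distance
    between two sequences; `dist` compares individual samples."""
    INF = float("inf")
    n, m = len(a), len(b)
    D = [[INF] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = dist(a[i - 1], b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]
```

As the disclosure notes, DTW is only one example of a distance between a candidate cycle and the reference cycle.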
  • the cycle end frame may be determined by identifying local minima of the error values across frames. Thus, multiple candidate end frames may be found. To limit the final number of end frames, a maximum variance between the local minima may be defined. The cycle is found once the cycle end frame is determined.
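Selecting candidate end frames as local minima of the per-frame error curve, with a tolerance that limits how far a minimum may lie above the best one (a stand-in for the maximum-variance limit described above), might be sketched as:

```python
def candidate_end_frames(errors, tolerance=0.0):
    """Indices whose error is a local minimum of the per-frame error
    curve; `tolerance` discards minima lying more than `tolerance`
    above the lowest minimum found."""
    minima = [i for i in range(1, len(errors) - 1)
              if errors[i - 1] > errors[i] <= errors[i + 1]]
    if not minima:
        return []
    best = min(errors[i] for i in minima)
    return [i for i in minima if errors[i] - best <= tolerance]
```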
  • the example diagram 400 illustrates two cycles, c_1^3 and c_1^4, as a result of the first iteration. That is, in the first iteration one candidate cycle spans the first through third frames of the frame buffer 124 , and another candidate cycle spans the first through fourth frames of the frame buffer 124 .
  • the example diagram 400 further illustrates five cycles as a result of a second iteration. Assuming the candidate cycle c_1^3, two cycles, c_4^6 and c_4^7, are shown as a result of the second iteration. Assuming the candidate cycle c_1^4, three cycles, c_5^7, c_5^8, and c_5^9, are shown as a result of the second iteration.
  • the iteration to search for the next cycle may continue until the number of iterations exceeds a predefined value or until the frame buffer 124 lacks sufficient frames to form a cycle.
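The recursive enumeration of candidate segmentations described above could be sketched as follows; end_candidates stands in for the per-iteration end-frame search, and all names and the depth limit are illustrative assumptions:

```python
def enumerate_segmentations(n_frames, end_candidates, start=0, max_depth=10):
    """Recursively enumerate cycle segmentations of a buffer of
    `n_frames` frames. `end_candidates(start)` returns candidate
    end indices (inclusive) for a cycle beginning at `start`.
    Each segmentation is a list of (start, end) index pairs."""
    if start >= n_frames or max_depth == 0:
        return [[]]
    segmentations = []
    for end in end_candidates(start):
        for rest in enumerate_segmentations(n_frames, end_candidates,
                                            end + 1, max_depth - 1):
            segmentations.append([(start, end)] + rest)
    # no candidates left: the remaining frames form an unsegmented tail
    return segmentations or [[]]
```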
  • FIG. 5 illustrates an example diagram 500 of a calculation of segmentation error.
  • the segmentation includes two cycles, i.e., c_1^3 and c_4^6.
  • in one example, the error of the segmentation is calculated as the sum of d(c_1^3, c_r) and d(c_4^6, c_r).
  • in another example, the error of the segmentation is calculated as the maximum of d(c_1^3, c_r) and d(c_4^6, c_r). It should be noted that these are merely examples, and other approaches for the calculation of segmentation error may be used.
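The two example error definitions (sum and maximum of the per-cycle distances to the reference cycle) can be expressed in one small helper; the names and the cycle_distance callback are illustrative assumptions:

```python
def segmentation_error(segmentation, buffer, reference, cycle_distance,
                       mode="sum"):
    """Error of a segmentation: combine the per-cycle distances to the
    reference cycle by sum (default) or by maximum."""
    distances = [cycle_distance(buffer[s:e + 1], reference)
                 for s, e in segmentation]
    return max(distances) if mode == "max" else sum(distances)
```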
  • FIG. 6 illustrates an example diagram 600 of a determination of cycles from all of the calculated segmentations.
  • the segmentations generated as result of the approach illustrated in FIG. 4 are sorted based on the segmentation error, which may be calculated according to the approach illustrated in FIG. 5 .
  • the first cycle of the segmentation having the minimum segmentation error is determined as the result.
  • the processor 102 generates cycle information for the detected cycle and removes frames of the detected cycle from the frame buffer 124 . Since frame data of the frame buffer 124 contains timestamp information, once the cycle is detected, cycle duration data 126 may be determined by calculating the elapsed time between the first frame and the last frame of the cycle.
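Picking the lowest-error segmentation, reporting the elapsed time of its first cycle, and trimming the buffer might be sketched as follows (all names are illustrative, and timestamps are assumed to be in seconds):

```python
def detect_one_cycle(segmentations, errors, timestamps, buffer):
    """Pick the segmentation with the lowest error, take its first
    cycle as the detected cycle, report the cycle's elapsed time,
    and drop the cycle's frames from the buffer."""
    best = min(range(len(segmentations)), key=lambda i: errors[i])
    start, end = segmentations[best][0]  # first cycle of the best segmentation
    duration = timestamps[end] - timestamps[start]
    del buffer[start:end + 1]            # remove the detected cycle's frames
    return duration
```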
  • FIG. 7 illustrates an example of preparation of the frame buffer 124 for detection of the next cycle.
  • the frames of the detected cycle are removed from the frame buffer 124 .
  • if the system 100 is used in an online mode to process streaming data, detecting cycles and capturing new data may be performed concurrently. In such a situation, the frame buffer 124 may be updated dynamically during runtime. As shown in FIG. 7 , two new frames are added to the frame buffer 124 from streaming data. Responsive to the updated buffer containing enough frames to form one cycle, the processing to detect an additional cycle may be repeated.
  • the processor 102 determines whether the system 100 is operating in an online mode.
  • the processor 102 may identify from a setting stored to the memory 110 that the system 100 is operating in an online mode. Additionally or alternately, the processor 102 may identify that the system 100 is operating in an online mode according to connection of the processor 102 to a source of motion capture data 114 , such as a motion capture device 118 or an input device 116 that receives data over a network from a motion capture device 118 or data storage. If the system 100 is not operating in online mode, control passes to operation 310 . If the system is operating in online mode, control passes to operation 314 .
  • the processor 102 determines whether the frame buffer 124 contains sufficient frames for one cycle.
  • the processor 102 may verify that the frame buffer 124 contains at least a predefined threshold number of frames (e.g., at least one frame, at least a predefined number of frames, at least the number of frames of a defined quantity of average cycles, at least the number of frames of the shortest cycle, at least a percentage amount of the length in frames of the reference activity sequence 120 , etc.). If the frame buffer 124 contains sufficient frames, control passes to operation 304 to detect an additional cycle. Otherwise, control passes to operation 312 .
  • the processor 102 outputs the duration of the detected cycles.
  • the cycle duration data 126 indicative of the durations of the detected cycles includes the elapsed time between the first frame and the last frame of the cycle.
  • the processor 102 determines whether streaming is active.
  • the processor 102 may identify whether additional motion capture data 114 has been received (e.g., from the motion capture device 118 , from a network connection, from a data storage, etc.). Additionally or alternately, the processor 102 may identify that streaming is active according to continued connection of the processor 102 to a source of motion capture data 114 , such as continued connection to the motion capture device 118 , network, input device 116 , or data storage. If streaming is active, control passes to operation 316 . Otherwise, control passes to operation 312 .
  • the processor 102 determines whether the frame buffer 124 contains sufficient frames for one cycle, similar to as discussed above with respect to operation 310 . If so, control passes to operation 304 to detect an additional cycle. If not, control passes to operation 314 to determine whether streaming is active.
  • the described approach provides a scalable solution to measure cycle duration in repeated human physical activities from IMU data, both for repeated activity sequences with a high standardization level and for repeated activity sequences containing abnormal activities.
  • the size of the frame buffer 124 may be adapted based on the quality of the results being produced by the approach. If the system 100 is achieving good results in the identification of cycles with the frame buffer 124, then the length of the frame buffer 124 may be reduced to reduce latency in the determination of cycles. However, if the system 100 is achieving poor results in the identification of cycles with the frame buffer 124, then the length of the frame buffer 124 may be increased to improve accuracy, but with a potential increase in latency.
  • the processor 102 may increase a length in frames of the frame buffer 124 responsive to the segmentation error for the detected cycle being above a predefined threshold error value, and/or the processor 102 may decrease the length in frames of the frame buffer 124 responsive to the segmentation error for the detected cycle being below a predefined threshold error value.
  • the processes, methods, or algorithms disclosed herein can be deliverable to/implemented by a processing device, controller, or computer, which can include any existing programmable electronic control unit or dedicated electronic control unit.
  • the processes, methods, or algorithms can be stored as data and instructions executable by a controller or computer in many forms including, but not limited to, information permanently stored on non-writable storage media such as ROM devices and information alterably stored on writeable storage media such as floppy disks, magnetic tapes, CDs, RAM devices, and other magnetic and optical media.
  • the processes, methods, or algorithms can also be implemented in a software executable object.
  • the processes, methods, or algorithms can be embodied in whole or in part using suitable hardware components, such as Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), state machines, controllers or other hardware components or devices, or a combination of hardware, software and firmware components.

Abstract

Using a global optimization, a cycle within a frame buffer including frames corresponding to one or more cycles of query activity sequences is detected. The detection includes creating a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles corresponding to cycles of a reference activity sequence until the frame buffer lacks sufficient frames to create additional cycles, computing segmentation errors for each of the plurality of cycle segmentations, and identifying the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error. Cycle duration data for the detected cycle is generated. Frames belonging to the detected cycle are removed from the frame buffer. The cycle duration data is output.

Description

TECHNICAL FIELD
The present disclosure relates to systems and methods for measurement of cycle duration in repeated activity sequences.
BACKGROUND
Aspects of human movement can have a large impact on, for example, the way a tool is designed, the way a workspace is laid out, or the way a task is performed. Understanding how the human body can move and interact with objects and the environment can result in tools that are more ergonomic, workspaces that are more efficient to navigate, and tasks that are more intuitive to perform. The range of possible human motions and gestures is vast, however, and simple tasks, such as lifting a cup, pointing in a direction, or turning a screw, often result from a complex set of biomechanical interactions. This relation of simple result from complex movement can make human motions and gestures extremely difficult to quantify or understand in a meaningful or practical way.
SUMMARY
In one or more illustrative examples, a system for measurement of cycle duration in repeated activity sequences includes a display device, a memory configured to store a motion analysis application and motion capture data including a reference activity sequence and a frame buffer including frames corresponding to one or more cycles of query activity sequences, and a processor, operatively connected to the memory and the display device. The processor is configured to execute the motion analysis application to detect a cycle within the frame buffer using a global optimization, including to create a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles until the frame buffer lacks sufficient frames to create additional cycles, compute segmentation errors for each of the plurality of cycle segmentations, and identify the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error, generate cycle duration data for the detected cycle, remove frames belonging to the detected cycle from the frame buffer, and output the cycle duration data to the display device.
In one or more illustrative examples, a method for measurement of cycle duration in repeated activity sequences includes detecting, using a global optimization, a cycle within a frame buffer including frames corresponding to one or more cycles of query activity sequences, the detecting including creating a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles corresponding to cycles of a reference activity sequence until the frame buffer lacks sufficient frames to create additional cycles, computing segmentation errors for each of the plurality of cycle segmentations, and identifying the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error, generating cycle duration data for the detected cycle; removing frames belonging to the detected cycle from the frame buffer; and outputting the cycle duration data to the display device.
In one or more illustrative examples, non-transitory computer readable medium comprising instructions of a motion analysis application that, when executed by one or more processors, cause the one or more processors to detect, using a global optimization, a cycle within a frame buffer including frames corresponding to one or more cycles of query activity sequences, the detecting including to create a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles corresponding to cycles of a reference activity sequence until the frame buffer lacks sufficient frames to create additional cycles, compute segmentation errors for each of the plurality of cycle segmentations, and identify the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error, generate cycle duration data for the detected cycle; remove frames belonging to the detected cycle from the frame buffer; continue to detect additional cycles within the frame buffer using the global optimization until the frame buffer lacks sufficient cycles for an additional cycle; and output the cycle duration data.
BRIEF DESCRIPTION OF THE DRAWINGS
FIG. 1 illustrates an example system for measurement of cycle duration in repeated activity sequences;
FIG. 2 illustrates an example of a cycle of a reference activity sequence and a frame buffer;
FIG. 3 illustrates an example process for measurement of cycle duration in repeated activity sequences;
FIG. 4 illustrates an example diagram of a cycle segmentation search within the frame buffer;
FIG. 5 illustrates an example diagram of a calculation of segmentation error;
FIG. 6 illustrates an example diagram of a determination of cycles from the calculated segmentations; and
FIG. 7 illustrates an example of preparation of the frame buffer for detection of the next cycle.
DETAILED DESCRIPTION
Embodiments of the present disclosure are described herein. It is to be understood, however, that the disclosed embodiments are merely examples and other embodiments can take various and alternative forms. The figures are not necessarily to scale; some features could be exaggerated or minimized to show details of particular components. Therefore, specific structural and functional details disclosed herein are not to be interpreted as limiting, but merely as a representative basis for teaching one skilled in the art to variously employ the embodiments. As those of ordinary skill in the art will understand, various features illustrated and described with reference to any one of the figures can be combined with features illustrated in one or more other figures to produce embodiments that are not explicitly illustrated or described. The combinations of features illustrated provide representative embodiments for typical applications. Various combinations and modifications of the features consistent with the teachings of this disclosure, however, could be desired for particular applications or implementations.
In industry, it is important yet challenging to measure cycle duration in repeated physical human activities based on inertial measurement unit (IMU) data. For example, the productivity level of an assembly line in a factory can be improved based on accurate measurement of the repeated actions of operators. However, manual measurement of repeated cycles is labor intensive. Hence, it is necessary to develop a solution to measure the cycle duration automatically. Some systems perform real-time detection of cycles from repeated activities based on pre-recorded standardized operations. However, such systems are best suited to repeated activities with a high standardization level, and are less suited to measuring cycle duration of repeated activities with many abnormalities. A high standardization level means that there are fewer motion differences in terms of orientation and speed between the query activities and the recorded standardized cycle.
This disclosure proposes a solution to measure cycle duration in repeated physical human activities using time-series data from IMUs, with improved performance. In contrast to some systems, which operate in a purely real-time manner, the proposed approach measures cycle duration in repeated activity sequences streamed into or otherwise stored in a buffer. The proposed approach utilizes global optimization over all activities stored in the buffer with the aim of minimizing the total error of all cycle durations. Hence, the cycle duration measurement accuracy is improved dramatically. Both the accuracy and the response delay are positively correlated with the buffer size. For highly standardized repeated activities, a relatively small buffer size can still achieve the accuracy expectation, and the response delay is reduced with the smaller buffer size. Further aspects of the disclosure are discussed in detail below.
FIG. 1 is a schematic diagram of an exemplary embodiment of a system 100 for measurement of cycle duration in repeated activity sequences. The system 100 may quantitatively compute an accuracy, e.g., a deviation from a prescribed motion or gesture and deviation from a target time period for completion, of a movement. The system 100 includes a processor 102 that is operatively connected to a memory 110, input device 116, motion capture device 118, and a display device 108. As is described in more detail below, during operation, the system 100 (i) detects, using a global optimization, a cycle within a frame buffer including frames corresponding to one or more cycles of query activity sequences, the detecting including (a) creating a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles corresponding to cycles of the reference activity sequence until the frame buffer lacks sufficient frames to create additional cycles, (b) computing segmentation errors for each of the plurality of cycle segmentations, and (c) identifying the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error; (ii) generates cycle duration data for the detected cycle; (iii) removes frames belonging to the detected cycle from the frame buffer; and (iv) outputs the cycle duration data.
In the system 100, the processor 102 includes one or more integrated circuits that implement the functionality of a central processing unit (CPU) 104 and graphics processing unit (GPU) 106. In some examples, the processor 102 is a system on a chip (SoC) that integrates the functionality of the CPU 104 and GPU 106, and optionally other components including, for example, the memory 110, a network device, and a positioning system, into a single integrated device. In other examples the CPU 104 and GPU 106 are connected to each other via a peripheral connection device such as PCI express or another suitable peripheral data connection. In one example, the CPU 104 is a commercially available central processing device that implements an instruction set such as one of the x86, ARM, Power, or MIPS instruction set families.
The GPU 106 may include hardware and software for display of at least two-dimensional (2D) and optionally three-dimensional (3D) graphics to a display device 108. The display device 108 may include an electronic display screen, projector, printer, or any other suitable device that reproduces a graphical display. In some examples, the processor 102 executes software programs, including drivers and other software instructions, using the hardware functionality in the GPU 106 to accelerate generation and display of the graphical depictions of models of human movement and visualizations of quantitative computations that are described herein.
During operation, the CPU 104 and GPU 106 execute stored program instructions that are retrieved from the memory 110. The stored program instructions include software that control the operation of the CPU 104 and the GPU 106 to perform the operations described herein.
While FIG. 1 depicts the processor 102 as including both the CPU 104 and GPU 106, alternative embodiments may omit the GPU 106, as for example the processor 102 may be of a server that generates output visualization data using only a CPU 104 and transmits the output visualization data to a remote client computing device that uses a GPU 106 and a display device 108 to display the data. Additionally, alternative embodiments of the processor 102 can include microcontrollers, application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), digital signal processors (DSPs), or any other suitable digital logic devices in addition to or as replacements of the CPU 104 and GPU 106.
In the system 100, the memory 110 includes both non-volatile memory and volatile memory devices. The non-volatile memory includes solid-state memories, such as NAND flash memory, magnetic and optical storage media, or any other suitable data storage device that retains data when the system 100 is deactivated or loses electrical power. The volatile memory includes static and dynamic random-access memory (RAM) that stores program instructions and data, including a motion analysis application 112 and motion capture data 114, during operation of the system 100. In some embodiments the CPU 104 and the GPU 106 each have access to separate RAM devices (e.g., a variant of DDR SDRAM for the CPU 104 and a variant of GDDR, HBM, or other RAM for the GPU 106) while in other embodiments the CPU 104 and GPU 106 access a shared memory device. The memory 110 may store the motion capture data 114 and motion analysis application 112 for maintenance and retrieval.
The input device 116 may include any of various devices that enable the system 100 to receive the motion capture data 114 and motion analysis application 112. Examples of suitable input devices include human interface inputs such as keyboards, mice, touchscreens, voice input devices, and the like, as well. In some examples the system 100 implements the input device 116 as a network adapter or peripheral interconnection device that receives data from another computer or external data storage device, which can be useful for receiving large sets of motion capture data 114 in an efficient manner.
The motion analysis application 112 includes instructions that, when executed by the processor 102 of the system 100, cause the system 100 to perform the processes and operations described herein. These processes and operations include (i) detecting, using a global optimization, a cycle within a frame buffer 124 including frames corresponding to one or more cycles of query activity sequences, the detecting including (a) creating a plurality of cycle segmentations by recursively iterating through the frame buffer 124 to identify candidate cycles corresponding to cycles of the reference activity sequence until the frame buffer 124 lacks sufficient frames to create additional cycles, (b) computing segmentation errors for each of the plurality of cycle segmentations, and (c) identifying the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error; (ii) generating cycle duration data 126 for the detected cycle; (iii) removing frames belonging to the detected cycle from the frame buffer 124; and (iv) outputting the cycle duration data 126.
The motion capture data 114 refers to a plurality of records representative of the locations of at least one tracked item or portion of the item over time. For example, the motion capture data 114 may include one or more of: records of positions of a reference point on a body part over time or at set time intervals, sensor data taken over time, a video stream or a video stream that has been processed using a computer-vision technique, data indicative of the operating state of a machine over time, etc. In some cases, the motion capture data 114 may include data representative of more than one continuous movement. For instance, the motion capture data 114 may include a combination of a plurality of combined motion capture data 114 sets.
A motion capture device 118 is a device configured to generate motion capture data 114. Motion capture devices 118 may include, as some non-limiting examples: cameras, visual sensors, infra-red sensors, ultrasonic sensors, accelerometers, pressure sensors, or the like. One non-limiting example of a motion capture device 118 is one or a pair of digital gloves that a user wears while performing cyclical motions. The digital gloves may include sensors that capture the motions of the user to generate the motion capture data 114 that may be stored in the memory 110 or elsewhere to a data storage device.
A movement is an action performed by an operator. A reference movement refers to a baseline or canonical version of the movement. The reference movement may be used as a standard of comparison for other movements, to allow for identification of how close the other movements are to the reference movement.
The motion capture data 114 may be generally classified into one of two categories for the purpose of computing the accuracy of a human motion: a reference activity sequence 120 that includes data representative of the reference or baseline movement, and a query activity sequence 122 that includes data representative of the test movement, i.e., a movement to be compared and quantitatively evaluated for accuracy relative to the baseline movement.
The reference activity sequence 120 may include motion capture data 114 received from the motion capture device 118. The data may also include processed movement data, such as, for example, frame, step, cycle, and time information gleaned from the raw movement data. In one example, the reference movement is represented as a reference activity sequence 120 having an ordered set of frames that each includes motion capture data 114 corresponding to a respective interval of time of the reference movement.
The query activity sequence 122 may also include motion capture data 114 received from the motion capture device 118. In some examples, a movement or movements in the motion capture data 114 include a label or labels classifying the movements as either reference movements or test movements. The motion analysis application 112 may be programmed or otherwise configured to receive instructions for classifying a movement or movements as reference movements or test movements, such as from a user via the input device 116 or from another source.
The motion analysis application 112 may also be programmed to separate a received movement into frames, whereby a “frame” corresponds to a discrete interval of time. In other words, each frame of a movement includes a portion of the motion capture data 114 corresponding to a portion of the movement occurring during a respective interval of the timeline for that movement. In some examples, the duration for the interval corresponding to an individual frame is preset. In some examples, the duration for the interval corresponding to an individual frame is set based on an instruction received from, for example, the user via the input device 116 or another source. In some examples, the duration for the interval corresponding to an individual frame is set with reference to one or more characteristics of the motion capture data 114. For example, in some embodiments, the duration for the interval corresponding to an individual frame is set with reference to one or more of a duration of a reference movement, a total travel distance for the movement, a number of individual motions or gestures within the movement, a speed of the movement, etc. Generally, the same interval for the duration of frames is used for both a reference movement and for test movements to be evaluated relative to the reference movement.
A frame buffer 124 is a data storage configured to hold frames of one or more query activity sequences 122. FIG. 2 illustrates an example of a cycle of a reference activity sequence 120 and a frame buffer 124. As shown, the reference activity sequence 120 includes four frames, while the frame buffer 124 includes repeated query activity sequences 122 with ten frames. The frame buffer 124 may contain all sequence frames if the system 100 is used offline to process saved query activity sequence 122 data. The frame buffer 124 may also be updated dynamically at runtime if the system 100 is used in an online mode to process streaming data. A frame includes motion capture data 114 captured from wearable sensors and a timestamp of the captured data. If the wearable sensors are IMU sensors of a motion capture device 118 (such as a glove or smart clothing), the captured data may be received from accelerometer, gyro, and/or magnetometer sensors.
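The frame and buffer structures described above might be sketched as follows. This is a minimal illustration only: the field names (`samples`, `timestamp`) and the deque-backed buffer are assumptions, not the actual data layout of the system 100.

```python
from collections import deque
from dataclasses import dataclass, field
from typing import Deque, List

@dataclass
class Frame:
    # IMU samples for this interval (e.g., accelerometer, gyro, and/or
    # magnetometer readings); the field names here are assumptions.
    samples: List[float]
    timestamp: float  # capture time of this frame, in seconds

@dataclass
class FrameBuffer:
    frames: Deque[Frame] = field(default_factory=deque)

    def append(self, frame: Frame) -> None:
        # New streamed frames are appended at the end of the buffer.
        self.frames.append(frame)

    def pop_cycle(self, n: int) -> List[Frame]:
        # Remove and return the first n frames (a detected cycle).
        return [self.frames.popleft() for _ in range(n)]

    def __len__(self) -> int:
        return len(self.frames)
```

Because each frame carries its own timestamp, cycle durations can later be computed directly from the first and last frames of a detected cycle.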
In some instances, motion capture data 114 may be received as a file of stored motion capture data from a data storage device. In such instances, the motion analysis application 112 may separate the movement or movements in the motion capture data 114 into frames to be added to the frame buffer 124 for further processing.
Referring back to FIG. 1, the cycle duration data 126 may include information indicative of the durations of the cycles in the frame buffer 124. This information may include, in an example, the elapsed time between the first frame and the last frame of the cycle within the frame buffer 124. In an example, the cycle duration data 126 may be displayed to the display device 108. As discussed in greater detail below, the system 100 may be configured to perform measurement of cycle duration of motion capture data 114 in the frame buffer 124 descriptive of repeated query activity sequences 122.
The motion analysis application 112 may also map frames of the test movement to corresponding frames of the reference movement. The cycle duration data 126 may be useful in identifying the boundaries of each of the cycles of query activity sequences 122 for comparison with the reference activity sequence 120. In some examples, the test movement and reference movement are synchronized so that frames of the test movement are mapped to frames of the reference movement that correspond temporally, and in some embodiments, the test movement and the reference movement are aligned in terms of gestures and motions within the movement, such that frames of the test movement are mapped to frames of the reference movement that correspond with regard to the sequence of motions and/or gestures performed in the movement. The motion analysis application 112 may further compute an accuracy of the test movement or test movements relative to the reference movement.
FIG. 3 illustrates an example process 300 for measurement of cycle duration in repeated activity sequences. For each iteration, one cycle is detected from the frame buffer 124, the cycle information is generated, and the frames belonging to the detected cycle are removed from the frame buffer 124. Such iteration continues until the frame buffer 124 lacks sufficient frames to form an additional cycle. The process 300 may be performed by the processor 102 of the system 100, as discussed in more detail below.
At 302, the processor 102 receives the reference activity sequence 120 and the frame buffer 124 containing repeated query activity sequences 122. In an example, the reference activity sequence 120 may be retrieved from the memory 110. The frame buffer 124 may include one or both of previously-recorded data from the motion capture device 118 and dynamic data streamed at real-time into the frame buffer 124 from the motion capture device 118.
At operation 304, the processor 102 detects one cycle from the frame buffer 124 using a global optimization. FIG. 4 illustrates an example diagram 400 of a cycle segmentation search within the frame buffer 124. Each iteration of detection of a cycle finds a cycle that matches the cycle of the reference activity sequence 120 from the remaining data in the frame buffer 124. In other words, to find a cycle in the frame buffer 124 is equivalent to finding a cycle end frame from the remaining data in the frame buffer 124. One method of finding the cycle end frame is first to calculate an error value for each frame. In an example, Equation (1) may be used to calculate the frame error value, as follows:
e(f_i) = d(c_s^i, c_r);  (1)
where
f_i denotes the frame at index i in the buffer;
c_s^i denotes the cycle consisting of frames from frame f_s to frame f_i in the buffer;
c_r denotes the reference cycle of the reference activity sequence 120; and
d(c_s^i, c_r) denotes the distance between the two cycles.
f_s is the start frame in the frame buffer 124 from which to search for the current cycle. Initially, when beginning the process 300, f_s refers to the first frame in the frame buffer 124. f_s is updated for each iteration as illustrated in the example diagram 400.
An approach to calculate the distance between two cycles is to apply dynamic time warping (DTW) to map the two cycles, and then identify the mapping error as the distance between the two cycles. DTW is a technique for comparing sequences that may vary in speed by computing an optimal matching between discrete portions of the sequences. DTW includes computing a "warping path" that corresponds to a mapping between a portion of one sequence and a similar portion of another sequence occurring at a different time or rate. It should be noted that use of DTW is merely one example, and other approaches for determining a distance between two cycles may also be used.
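As a concrete illustration of the DTW distance mentioned above, a minimal dynamic-programming sketch over scalar frame features might look like the following. The scalar features and the absolute-difference local cost are simplifying assumptions; real IMU frames would likely use a vector norm over the sensor channels.

```python
def dtw_distance(seq_a, seq_b):
    """Classic O(len(a) * len(b)) dynamic time warping distance.

    seq_a, seq_b: sequences of scalar frame features (an assumption;
    a vector norm would serve as the local cost for real IMU data).
    """
    n, m = len(seq_a), len(seq_b)
    inf = float("inf")
    # dp[i][j] = cost of the best warping path aligning a[:i] with b[:j]
    dp = [[inf] * (m + 1) for _ in range(n + 1)]
    dp[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(seq_a[i - 1] - seq_b[j - 1])
            dp[i][j] = cost + min(dp[i - 1][j],      # step in a only
                                  dp[i][j - 1],      # step in b only
                                  dp[i - 1][j - 1])  # step in both
    return dp[n][m]
```

Note that a stretched copy of a sequence maps to the original with zero cost, which is exactly why DTW tolerates speed differences between a query cycle and the reference cycle.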
Responsive to calculating the error value of each frame, candidate cycle end frames may be determined by identifying local minima of the frame error values. Thus, multiple end frames may be found. To limit the final number of end frames, a maximum variance between the local minima may be defined. The cycle is found once the cycle end frame is determined.
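The local-minima search for candidate end frames might be sketched as below. The neighbor-comparison definition of a local minimum and the `tolerance` pruning rule are assumptions standing in for the maximum-variance limit described above.

```python
def candidate_end_frames(errors, tolerance=0.5):
    """Return indices of local minima of the per-frame error values.

    errors: e(f_i) for each frame index i, per Equation (1).
    tolerance: assumed pruning rule; keep only minima whose error is
    within `tolerance` of the smallest local minimum, limiting the
    final number of candidate end frames.
    """
    minima = [i for i in range(1, len(errors) - 1)
              if errors[i] <= errors[i - 1] and errors[i] <= errors[i + 1]]
    if not minima:
        return []
    best = min(errors[i] for i in minima)
    return [i for i in minima if errors[i] - best <= tolerance]
```

Each returned index is a candidate cycle end frame, so each branches the segmentation search into a separate candidate cycle.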
The example diagram 400 illustrates two cycles c_1^3 and c_1^4 as a result of the first iteration. Meaning, in the first iteration one candidate cycle is indicated as being the first through third frames of the frame buffer 124, and another candidate cycle is indicated as being the first through fourth frames of the frame buffer 124. The example diagram 400 further illustrates five cycles as a result of a second iteration. Assuming the candidate cycle c_1^3, two cycles c_4^6 and c_4^7 are shown as a result of the second iteration. Assuming the candidate cycle c_1^4, three cycles c_5^7, c_5^8, and c_5^9 are shown as a result of the second iteration. The iteration to search for the next cycle may continue until the number of iterations exceeds a predefined value or until the frame buffer 124 lacks sufficient frames to form a cycle.
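The recursive segmentation search of diagram 400 might be sketched as follows. The `find_candidates` callback stands in for the per-frame error search, and the `max_depth` limit on the number of iterations is an assumed parameter; both are illustrative, not the actual interface.

```python
def enumerate_segmentations(start, n_frames, find_candidates, max_depth=10):
    """Recursively enumerate candidate cycle segmentations of the buffer.

    start: index of the first frame of the current cycle (f_s).
    n_frames: number of frames in the buffer.
    find_candidates: callable taking a start index and returning the
        candidate cycle end indices (stands in for the local-minima
        search; an assumed interface).
    Returns a list of segmentations, each a list of (start, end) cycles.
    """
    if max_depth == 0 or start >= n_frames:
        return [[]]
    ends = [e for e in find_candidates(start) if e < n_frames]
    if not ends:
        return [[]]  # no further cycle fits in the remaining frames
    segmentations = []
    for end in ends:
        # Each candidate end frame branches the search; the next cycle
        # starts at the frame following this candidate's end frame.
        for tail in enumerate_segmentations(end + 1, n_frames,
                                            find_candidates, max_depth - 1):
            segmentations.append([(start, end)] + tail)
    return segmentations
```

With zero-based indices, the branching of diagram 400 (c_1^3 and c_1^4 after the first iteration, five cycles after the second) corresponds to candidates {2, 3} from frame 0, {5, 6} from frame 3, and {6, 7, 8} from frame 4.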
FIG. 5 illustrates an example diagram 500 of a calculation of segmentation error. As shown, the segmentation includes two cycles, i.e., c_1^3 and c_4^6. In one implementation, the error of the segmentation is calculated as the sum of d(c_1^3, c_r) and d(c_4^6, c_r). In another implementation, the error of the segmentation is calculated as the maximum value of d(c_1^3, c_r) and d(c_4^6, c_r). It should be noted that these are merely examples, and other approaches for the calculation of segmentation error may be used.
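The two segmentation-error variants described above might be written as below; the treatment of an empty segmentation as worst-case is an added assumption.

```python
def segmentation_error(cycle_distances, mode="sum"):
    """Aggregate per-cycle distances d(c, c_r) into a segmentation error.

    mode="sum": total of all cycle-to-reference distances.
    mode="max": worst single cycle-to-reference distance.
    """
    if not cycle_distances:
        return float("inf")  # assumed: an empty segmentation ranks last
    if mode == "sum":
        return sum(cycle_distances)
    if mode == "max":
        return max(cycle_distances)
    raise ValueError(f"unknown mode: {mode}")
```

The "sum" variant favors segmentations whose cycles match the reference well on average, while the "max" variant penalizes any single badly matching cycle.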
FIG. 6 illustrates an example diagram 600 of a determination of cycles from all of the calculated segmentations. In an example, the segmentations generated as a result of the approach illustrated in FIG. 4 are sorted based on the segmentation error, which may be calculated according to the approach illustrated in FIG. 5. The first cycle of the segmentation having the minimum segmentation error is determined as the result.
At operation 306, the processor 102 generates cycle information for the detected cycle and removes frames of the detected cycle from the frame buffer 124. Since frame data of the frame buffer 124 contains timestamp information, once the cycle is detected, cycle duration data 126 may be determined by calculating the elapsed time between the first frame and the last frame of the cycle.
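Given the timestamps carried by each frame, the duration computation and buffer cleanup of operation 306 might be sketched as follows; the `(samples, timestamp)` tuple layout of a frame is an assumption made for illustration.

```python
def pop_detected_cycle(buffer, end_index):
    """Compute the duration of a detected cycle and remove its frames.

    buffer: list of (samples, timestamp) frames (an assumed layout).
    end_index: index of the last frame of the detected cycle.
    Returns (duration, remaining_buffer), where duration is the elapsed
    time between the first and last frame of the cycle.
    """
    cycle = buffer[:end_index + 1]
    duration = cycle[-1][1] - cycle[0][1]
    # Frames of the detected cycle are removed; the remainder is kept
    # for detection of the next cycle.
    return duration, buffer[end_index + 1:]
```

The remaining buffer returned here is exactly the state from which the next iteration of cycle detection starts, as illustrated in FIG. 7.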
FIG. 7 illustrates an example of preparation of the frame buffer 124 for detection of the next cycle. In an example, the frames of the detected cycle are removed from the frame buffer 124. If the system 100 is used in an online mode to process streaming data, detecting cycles and capturing new data may be performed concurrently. In such a situation, the frame buffer 124 may be updated dynamically during runtime. As shown in FIG. 7, two new frames are added to the frame buffer 124 from streaming data. Responsive to the updated buffer containing enough frames to form one cycle, the processing to detect an additional cycle may be repeated.
Accordingly, at operation 308, the processor 102 determines whether the system 100 is operating in an online mode. In an example, the processor 102 may identify from a setting stored to the memory 110 that the system 100 is operating in an online mode. Additionally or alternately, the processor 102 may identify that the system 100 is operating in an online mode according to connection of the processor 102 to a source of motion capture data 114, such as via a motion capture device 118 or over a network, e.g., via an input device 116 that receives data over a network from a motion capture device 118 or from data storage. If the system 100 is not operating in online mode, control passes to operation 310. If the system is operating in online mode, control passes to operation 314.
At operation 310, the processor 102 determines whether the frame buffer 124 contains sufficient frames for one cycle. In an example, the processor 102 may verify that the frame buffer 124 contains at least a predefined threshold number of frames (e.g., at least one frame, at least a predefined number of frames, at least the number of frames of a defined quantity of average cycles, at least the number of frames of the shortest cycle, at least a percentage amount of the length in frames of the reference activity sequence 120, etc.). If the frame buffer 124 contains sufficient frames, control passes to operation 304 to detect an additional cycle. Otherwise, control passes to operation 312.
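One of the sufficiency checks listed above — comparing the buffer length against a percentage of the reference activity sequence length — may be sketched as follows; the function name and the 0.8 fraction are illustrative assumptions, not values specified in the disclosure:

```python
def has_sufficient_frames(frame_buffer, reference_len, min_fraction=0.8):
    """Check that the buffer holds at least a threshold number of frames.

    Here the threshold is a fraction of the reference activity sequence
    length in frames; other thresholds (shortest cycle, average cycle
    count) could be substituted.
    """
    return len(frame_buffer) >= min_fraction * reference_len

# Example: reference sequence of 100 frames, 80% threshold
print(has_sufficient_frames([0] * 80, 100))  # → True
print(has_sufficient_frames([0] * 79, 100))  # → False
```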
At operation 312, the processor 102 outputs the duration of the detected cycles. In an example, the cycle duration data 126 indicative of the durations of the detected cycles includes the elapsed time between the first frame and the last frame of the cycle. After operation 312, the process 300 ends.
At operation 314, the processor 102 determines whether streaming is active. In an example, the processor 102 may identify whether additional motion capture data 114 has been received (e.g., from the motion capture device 118, from a network connection, from data storage, etc.). Additionally or alternately, the processor 102 may identify that streaming to the system 100 is active according to continued connection of the processor 102 to a source of motion capture data 114, such as continued connection to the motion capture device 118, network, input device 116, or data storage. If streaming is active, control passes to operation 316. Otherwise, control passes to operation 312.
At operation 316, the processor 102 determines whether the frame buffer 124 contains sufficient frames for one cycle, similar to as discussed above with respect to operation 310. If so, control passes to operation 304 to detect an additional cycle. If not, control passes to operation 314 to determine whether streaming is active.
Thus, the described approach provides a scalable solution to measure cycle duration in repeated human physical activities from IMU data, both for repeated activity sequences with a high standardization level and for repeated activity sequences containing abnormal activities.
As a further enhancement, the size of the frame buffer 124 may be adapted based on the quality of the results being produced by the approach. If the system 100 is achieving good results in identifying cycles with the frame buffer 124, then the length of the frame buffer 124 may be reduced to lower latency in the determination of cycles. However, if the system 100 is achieving poor results in identifying cycles with the frame buffer 124, then the length of the frame buffer 124 may be increased to improve accuracy, but with a potential increase in latency. For instance, the processor 102 may increase the length in frames of the frame buffer 124 responsive to the segmentation error for the detected cycle being above a predefined threshold error value, and/or the processor 102 may decrease the length in frames of the frame buffer 124 responsive to the segmentation error for the detected cycle being below a predefined threshold error value.
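This error-driven buffer adaptation may be sketched as follows; the threshold values, step size, and function name are illustrative assumptions, not parameters specified in the disclosure:

```python
def adapt_buffer_length(current_len, seg_error, high_err=2.0, low_err=0.5, step=25):
    """Grow the buffer when error is high (favor accuracy), shrink when low
    (favor latency); leave it unchanged in between."""
    if seg_error > high_err:
        return current_len + step            # more context for hard segmentations
    if seg_error < low_err:
        return max(step, current_len - step)  # lower latency when confident
    return current_len

# Example adjustments around a 100-frame buffer
print(adapt_buffer_length(100, 3.0))  # → 125 (high error: grow)
print(adapt_buffer_length(100, 0.1))  # → 75  (low error: shrink)
print(adapt_buffer_length(100, 1.0))  # → 100 (in range: unchanged)
```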
The processes, methods, or algorithms disclosed herein can be deliverable to/implemented by a processing device, controller, or computer, which can include any existing programmable electronic control unit or dedicated electronic control unit. Similarly, the processes, methods, or algorithms can be stored as data and instructions executable by a controller or computer in many forms including, but not limited to, information permanently stored on non-writable storage media such as ROM devices and information alterably stored on writeable storage media such as floppy disks, magnetic tapes, CDs, RAM devices, and other magnetic and optical media. The processes, methods, or algorithms can also be implemented in a software executable object. Alternatively, the processes, methods, or algorithms can be embodied in whole or in part using suitable hardware components, such as Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs), state machines, controllers or other hardware components or devices, or a combination of hardware, software and firmware components.
While exemplary embodiments are described above, it is not intended that these embodiments describe all possible forms encompassed by the claims. The words used in the specification are words of description rather than limitation, and it is understood that various changes can be made without departing from the spirit and scope of the disclosure. As previously described, the features of various embodiments can be combined to form further embodiments of the invention that may not be explicitly described or illustrated. While various embodiments could have been described as providing advantages or being preferred over other embodiments or prior art implementations with respect to one or more desired characteristics, those of ordinary skill in the art recognize that one or more features or characteristics can be compromised to achieve desired overall system attributes, which depend on the specific application and implementation. These attributes can include, but are not limited to cost, strength, durability, life cycle cost, marketability, appearance, packaging, size, serviceability, weight, manufacturability, ease of assembly, etc. As such, to the extent any embodiments are described as less desirable than other embodiments or prior art implementations with respect to one or more characteristics, these embodiments are not outside the scope of the disclosure and can be desirable for particular applications.

Claims (20)

What is claimed is:
1. A system for measurement of cycle duration in repeated activity sequences, comprising:
a display device;
a memory configured to store a motion analysis application and motion capture data including a reference activity sequence and a frame buffer including frames corresponding to one or more cycles of query activity sequences; and
a processor, operatively connected to the memory and the display device, and configured to execute the motion analysis application to
detect a cycle within the frame buffer using a global optimization, including to
create a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles corresponding to cycles of the reference activity sequence until the frame buffer lacks sufficient frames to create additional cycles,
compute segmentation errors for each of the plurality of cycle segmentations, and
identify the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error,
generate cycle duration data for the detected cycle,
remove frames belonging to the detected cycle from the frame buffer, and
output the cycle duration data to the display device.
2. The system of claim 1, wherein the processor is further configured to execute the motion analysis application to continue to detect additional cycles within the frame buffer using the global optimization until the frame buffer lacks sufficient cycles for an additional cycle.
3. The system of claim 1, wherein the frames of the frame buffer include timestamp information, and the processor is further configured to execute the motion analysis application to generate the cycle duration data by calculating an elapsed time between the timestamp information of a first frame of the detected cycle and the timestamp information of a last frame of the detected cycle.
4. The system of claim 1, wherein the processor is further configured to execute the motion analysis application to, responsive to streaming of frame data being active and additional frames being received, append the additional frames to an end of the frame buffer.
5. The system of claim 1, wherein the processor is further configured to execute the motion analysis application to, responsive to streaming of frame data being active and the frame buffer lacking sufficient frames to create additional cycles, wait for additional frames to be received to the processor.
6. The system of claim 1, wherein the processor is further configured to execute the motion analysis application to compute the segmentation errors for each of the plurality of cycle segmentations according to a sum of the segmentation errors of the candidate cycles in the respective cycle segmentation.
7. The system of claim 1, wherein the processor is further configured to execute the motion analysis application to compute the segmentation errors for each of the plurality of cycle segmentations according to a maximum of the segmentation errors of the candidate cycles in the respective cycle segmentation.
8. The system of claim 1, wherein the processor is further configured to execute the motion analysis application to increase a length in frames of the frame buffer responsive to the segmentation error for the detected cycle being above a predefined threshold error value.
9. The system of claim 1, wherein the processor is further configured to execute the motion analysis application to decrease a length in frames of the frame buffer responsive to the segmentation error for the detected cycle being below a predefined threshold error value.
10. A method for measurement of cycle duration in repeated activity sequences, comprising:
detecting, using a global optimization, a cycle within a frame buffer including frames corresponding to one or more cycles of query activity sequences, the detecting including
creating a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles corresponding to cycles of a reference activity sequence until the frame buffer lacks sufficient frames to create additional cycles,
computing segmentation errors for each of the plurality of cycle segmentations, and
identifying the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error,
generating cycle duration data for the detected cycle;
removing frames belonging to the detected cycle from the frame buffer; and
outputting the cycle duration data.
11. The method of claim 10, further comprising continuing to detect additional cycles within the frame buffer using the global optimization until the frame buffer lacks sufficient cycles for an additional cycle.
12. The method of claim 10, wherein the frames of the frame buffer include timestamp information, and further comprising generating the cycle duration data by calculating an elapsed time between the timestamp information of a first frame of the detected cycle and the timestamp information of a last frame of the detected cycle.
13. The method of claim 10, further comprising, responsive to streaming of frame data being active and additional frames being received, appending the additional frames to an end of the frame buffer.
14. The method of claim 10, further comprising, responsive to streaming of frame data being active and the frame buffer lacking sufficient frames to create additional cycles, waiting for additional frames to be received.
15. The method of claim 10, further comprising computing the segmentation errors for each of the plurality of cycle segmentations according to a sum of the segmentation errors of the candidate cycles in the respective cycle segmentation.
16. The method of claim 10, further comprising computing the segmentation errors for each of the plurality of cycle segmentations according to a maximum of the segmentation errors of the candidate cycles in the respective cycle segmentation.
17. The method of claim 10, further comprising increasing a length in frames of the frame buffer responsive to the segmentation error for the detected cycle being above a predefined threshold error value.
18. The method of claim 10, further comprising decreasing a length in frames of the frame buffer responsive to the segmentation error for the detected cycle being below a predefined threshold error value.
19. A non-transitory computer readable medium comprising instructions of a motion analysis application that, when executed by one or more processors, cause the one or more processors to:
detect, using a global optimization, a cycle within a frame buffer including frames corresponding to one or more cycles of query activity sequences, the detection including to
create a plurality of cycle segmentations by recursively iterating through the frame buffer to identify candidate cycles corresponding to cycles of a reference activity sequence until the frame buffer lacks sufficient frames to create additional cycles,
compute segmentation errors for each of the plurality of cycle segmentations, and
identify the detected cycle as the one of the plurality of cycle segmentations having a lowest segmentation error,
generate cycle duration data for the detected cycle;
remove frames belonging to the detected cycle from the frame buffer;
continue to detect additional cycles within the frame buffer using the global optimization until the frame buffer lacks sufficient cycles for an additional cycle; and
output the cycle duration data.
20. The medium of claim 19, further comprising instructions of the motion analysis application that, when executed by one or more processors, cause the one or more processors to, responsive to streaming of frame data being active and additional frames being received concurrent to the processing of the detecting of cycles using the global optimization, append the additional frames to an end of the frame buffer.