Update: The final validation paper has been accepted for publication in the Journal of Sports Engineering and Technology. You can read the full text here.
Introduction
The original version of My Jump Lab (named just My Jump) was released in 2015, and it was the first app to measure jump height using slow-motion video analysis. It’s crazy to think how it has evolved ever since; in that version, you could only measure jump height using the flight-time method. No force-velocity profiles, no asymmetry tests, no drop jumps, and of course the other apps I released over the years and later integrated into My Jump Lab, like My Lift for barbell velocity or My Rom for range of motion, didn’t exist yet. But its foundation was strong: in the first validation paper comparing My Jump to a professional force platform, very high correlations (r > 0.97) and small differences (about 2 cm) were observed. Over the years, more than 10 independent scientific studies from different research groups all over the world have replicated those findings with different populations, tests, and versions of the app, and the conclusion is always the same: it works.
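For those curious, the flight-time method that has powered the app since day one boils down to a single kinematic equation, h = g·t²/8, where t is the time spent in the air. Here is a minimal Python sketch of the idea, with illustrative names and an assumed 240 fps recording; this is not the app’s actual code:

```python
# Minimal sketch of the flight-time method (illustrative, not My Jump Lab's internal code).
# Jump height from flight time: h = g * t^2 / 8, with t = flight time in seconds.

G = 9.81  # gravitational acceleration, m/s^2

def jump_height_from_frames(takeoff_frame: int, landing_frame: int, fps: float = 240.0) -> float:
    """Return jump height in metres from the take-off and landing frames of a slow-motion video."""
    flight_time = (landing_frame - takeoff_frame) / fps   # seconds spent in the air
    return G * flight_time ** 2 / 8                       # classic flight-time equation

# Example: 132 frames of flight at 240 fps -> roughly 0.37 m (37 cm)
print(round(jump_height_from_frames(0, 132), 3))
```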
Slow motion is extremely useful for human movement analysis, and My Jump Lab uses it to manually navigate each frame of the video and accurately detect the take-off and landing moments, providing a valid and reliable measure of jump height and derived metrics. However, I have been silently working for years on what I think is the next frontier when using smartphones to test athletes: automatic measurement of performance. The first time I achieved this was with barbell velocity tracking. A year ago I released an update for My Jump Lab that used Artificial Intelligence (AI) techniques to measure barbell velocity in real time, much like a linear transducer does.
This real time feedback was a game-changer, since it dramatically reduced testing time and made it possible to measure important fatigue-management metrics like velocity loss (a quick sketch of that calculation is shown below). A recent study demonstrates how valid and reliable this AI barbell velocity monitoring is in comparison with a GymAware linear transducer. Now it was time to take that approach to the most popular test in My Jump Lab: the countermovement jump (CMJ).
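For readers unfamiliar with velocity-based training, velocity loss is usually defined as the percentage drop of a rep’s mean velocity relative to the fastest rep of the set. A minimal sketch of that calculation, with hypothetical numbers rather than anything taken from the app:

```python
# Illustrative only: velocity loss as commonly defined in velocity-based training,
# i.e. the percentage drop of the last rep's mean velocity relative to the fastest rep of the set.

def velocity_loss(rep_velocities: list[float]) -> float:
    """Percentage velocity loss between the fastest and the last rep of a set."""
    best = max(rep_velocities)
    last = rep_velocities[-1]
    return (1 - last / best) * 100.0

set_velocities = [0.82, 0.80, 0.76, 0.71, 0.66]  # mean rep velocities (m/s), hypothetical set
print(f"Velocity loss: {velocity_loss(set_velocities):.1f}%")  # ~19.5%
```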
Validation study: experimental design
The validation of My Jump Lab to measure the CMJ in real time is actually a proof-of-concept case study. A proof-of-concept is a type of research designed to explore the potential of a novel method, technology, idea or approach, and to assess its feasibility for future studies. Considering that this new real time function uses AI to automatically recognize the human body and track its position on screen, rather than the standard flight-time or take-off velocity methods, a proof-of-concept to test how feasible this novel approach is seemed perfect.
So when I created this experimental new feature, I first compared 10 jumps against another iPhone running the slow motion analysis of My Jump Lab, and the comparison was good. Such a small sample size was of course not enough, so before getting too excited, I went to the lab and performed 60 jumps myself while measuring simultaneously with the new AI mode in My Jump Lab and a dual force plate (Hawkin Dynamics). After looking at the R² (0.94), I was allowed to be at least a bit excited, but more was needed. In the original validation study of My Jump, 100 total jumps were compared, and most validation studies use around 100-300 jumps. For this experiment, and just to be completely sure about the results, I ended up collecting 400 jumps from myself over a period of 6 months to systematically test the app’s ability to measure my jumps in different contexts: more or less fatigued, wearing different clothes and using the force plates in different places. The AI method uses image recognition to detect the human body, so testing different clothes, colors, backgrounds, lighting conditions, etc. can challenge the app, but on the other hand it tests its ability to work in the real world. Thus, after collecting the data (which can be found here for skeptics who want to replicate my results), I ran the statistics. This is what I found.
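For anyone who wants to replicate the agreement check on the shared dataset, the core of it is simply a correlation between paired measurements. A minimal Python sketch, using placeholder values rather than my actual data:

```python
# Sketch of the validity check: Pearson r and R^2 between the app and the force plate.
# The arrays below are placeholders; the real dataset is linked in the post.
import numpy as np
from scipy import stats

app_cm   = np.array([31.2, 34.5, 29.8, 36.1, 33.0])   # jump heights from the AI mode (cm)
plate_cm = np.array([25.0, 28.7, 24.1, 30.2, 27.4])   # jump heights from the force plate (cm)

r, p = stats.pearsonr(app_cm, plate_cm)
print(f"r = {r:.3f}, R^2 = {r**2:.3f}, p = {p:.4f}")
```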
Results: Raw data
The new AI feature in My Jump Lab uses computer vision (image recognition) to create a bounding box around the person being captured by the camera, and calculates its position in pixels during the jump. Then, it converts pixels to cm by using the body height from the user’s profile as a calibration factor. When first testing the app, I compared the raw jump height measured with the app against the force plate. Preliminary results in terms of validity were great: a very high correlation (r = 0.97) between instruments was observed.
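To make the calibration idea concrete, here is a simplified sketch of how a body-height calibration can turn a pixel displacement into centimetres. Names and numbers are illustrative; the app’s actual computer-vision pipeline is of course more involved:

```python
# Simplified sketch of pixel-to-centimetre calibration using the user's body height.
# Illustrative only; not the app's actual implementation.

def pixels_to_cm(displacement_px: float, body_height_cm: float, standing_box_height_px: float) -> float:
    """Convert a vertical displacement measured in pixels into centimetres.

    The bounding box around the standing athlete is assumed to span the body height,
    so cm-per-pixel = body_height_cm / standing_box_height_px.
    """
    cm_per_px = body_height_cm / standing_box_height_px
    return displacement_px * cm_per_px

# Example: a 180 cm athlete whose standing bounding box is 900 px tall
# rises 190 px at the apex of the jump -> raw estimate of 38 cm.
print(pixels_to_cm(190, 180, 900))
```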
Then, the Bland-Altman analysis, a widely used technique in validation studies, gave me more insights about the difference between measures. A Bland-Altman plot presents the differences between instruments over the whole range of the variable that was measured. It is basically helpful to see whether the difference (on the Y-axis of the graph) stays similar over the entire range of scores measured with the instruments (X-axis). Simply put, in this case it helps to understand whether the difference between the app and the force plate is systematic (it’s always the same, no matter if you measure jumps of 20 cm or 40 cm) or proportional (the error is higher or lower depending on the jump height that was measured). Here, it can clearly be observed that the difference is systematic, which is good, because it’s always very similar.
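If you want to reproduce this kind of plot with the shared data, a standard Bland-Altman analysis can be sketched in a few lines of Python (the arrays below are placeholders, not my dataset):

```python
# Sketch of a Bland-Altman analysis with matplotlib; placeholder values for illustration.
import numpy as np
import matplotlib.pyplot as plt

app   = np.array([31.2, 34.5, 29.8, 36.1, 33.0, 28.4])   # cm, AI mode (placeholder)
plate = np.array([25.0, 28.7, 24.1, 30.2, 27.4, 22.9])   # cm, force plate (placeholder)

diff = app - plate              # difference between instruments (Y-axis)
mean = (app + plate) / 2        # mean of the two instruments (X-axis)
bias = diff.mean()              # mean bias
loa  = 1.96 * diff.std(ddof=1)  # 95% limits of agreement

plt.scatter(mean, diff)
plt.axhline(bias, color="k", label=f"bias = {bias:.1f} cm")
plt.axhline(bias + loa, color="k", linestyle="--")
plt.axhline(bias - loa, color="k", linestyle="--")
plt.xlabel("Mean of app and force plate (cm)")
plt.ylabel("Difference, app - force plate (cm)")
plt.legend()
plt.show()
```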
However, the mean bias between instruments was large and statistically significant: 6 cm on average, with the app systematically overestimating the real score. That’s not good. But don’t worry: since the correlation was so high and the error was systematic, the linear regression equation can be used to apply a correction factor to the raw data from the app and reduce that measurement error. That’s what I did, and these are the final results.
Results: Final corrected data
The regression line basically creates a linear equation that allows the score of one variable to be predicted from the score of the other. In this case, a linear equation was created (see Fig. 1) that allowed the real score (the one detected with the force plate) to be predicted from the score collected with the raw AI method. Thus, applying that equation to the raw data from the app would theoretically provide new values closer to those actually obtained with the force plate. The final results can be found in Fig. 3 and Fig. 4.
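The correction step itself is straightforward to reproduce: fit a linear regression that predicts the force-plate value from the raw AI value, then apply it to new raw readings. A minimal sketch with placeholder data (the published equation comes from the full 400-jump dataset, not from these numbers):

```python
# Sketch of the regression-based correction: predict the force-plate value
# from the raw AI value, then apply the fitted equation to new readings.
import numpy as np
from scipy import stats

raw_app = np.array([31.2, 34.5, 29.8, 36.1, 33.0, 28.4])   # cm, raw AI mode (placeholder)
plate   = np.array([25.0, 28.7, 24.1, 30.2, 27.4, 22.9])   # cm, force plate (placeholder)

slope, intercept, r, p, se = stats.linregress(raw_app, plate)

def corrected_jump_height(raw_cm: float) -> float:
    """Apply the regression-based correction to a raw AI reading."""
    return slope * raw_cm + intercept

print(f"corrected = {slope:.3f} * raw + {intercept:.3f}")
print(corrected_jump_height(32.0))
```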
As can be seen, a very high correlation is preserved, while the systematic difference between devices was reduced to about 2 cm, a trivial (effect size = 0.1) and statistically non-significant difference. To put these final results in context, the average difference between a force plate and My Jump Lab in slow motion is about 1 cm, while the difference between a force plate and other alternative technologies like inertial measurement units (IMUs) is about 4 cm.
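For reference, the effect size of this kind of paired comparison can be expressed as Cohen’s d, the mean of the paired differences divided by their standard deviation. A quick sketch with placeholder values:

```python
# Sketch of an effect size (Cohen's d) for the residual app vs. force plate difference.
# Placeholder values; effect sizes below ~0.2 are typically considered trivial.
import numpy as np

corrected_app = np.array([25.4, 29.1, 23.8, 30.5, 27.0, 23.3])   # cm, after correction (placeholder)
plate         = np.array([25.0, 28.7, 24.1, 30.2, 27.4, 22.9])   # cm, force plate (placeholder)

diff = corrected_app - plate
cohens_d = diff.mean() / diff.std(ddof=1)
print(f"Cohen's d = {cohens_d:.2f}")
```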
Final thoughts
This proof-of-concept clearly highlights the potential of this novel method to measure jump height in real time with just your camera, and it opens the door to future experiments that could try to replicate these findings. Until someone does (and I certainly hope it happens soon), you can see the raw data yourself and, of course, download the app and try it for free for 7 days.
Note: this update is only available for iOS devices. The real time jump monitoring is a beta feature and only works with the CMJ.