Over the holidays, I started thinking that an activity tracker algorithm should be position agnostic, i.e. it shouldn’t make a difference whether you have your arm by your side, over your head, etc. It also shouldn’t matter what the loading on that arm is, and the algorithm should produce the same result whether you’re carrying a dumbbell back to the rack at the gym or simply walking down the street. It is the vibration caused by taking a step, which is transmitted through the body to wherever the tracker is being worn, that is really the key signal. I decided to investigate this.
I picked a nice level spot of sidewalk in SF where I walked 50 steps six times in a row, taking care to make the exact same strides each time by stepping on the regularly spaced expansion joints in the cement. Each time I made the trip, I held my left arm outstretched in one of the following positions:
- Down at my side, perpendicular to the ground
- Above my head, perpendicular to the ground
- In front, parallel to the ground and perpendicular to the plane of my chest
- Behind, parallel to the ground and perpendicular to the plane of my chest (or as close as I could get it)
- Left of me, parallel to the ground and the plane of my chest
- Right, parallel to the ground and the plane of my chest
In case that language isn’t clear, here’s an amazing diagram which shows the single-letter references I’ll use throughout this post to refer to each position:
I also decided I would have a better chance of holding my arm steady if it was weighted, so I chose to hold a 5-pack of Coors Light in my left hand, grabbing onto the empty loop where the missing 6th beer should have been. This put a mass of around 1.8 kg at the tip of my arm, but what I failed to anticipate until I was halfway through the experiment was that the 5-pack oscillated back and forth on its plastic collar, potentially introducing undesirable signals. As I have often thought on Saturday and Sunday mornings, Coors Light turned out to be a bad decision. In future experiments, I will choose something less dynamic like a barbell or a book.
As I was pacing off my 50 steps with my arm in each position, I also recorded data from my Flex (located on the same wrist as my tracker) and my iPhone 5s M7 processor (in my left pants pocket). Here are the results:
Of course, the M7 was unaffected by the arm position because it was in my pocket, but both of these devices turned out to be pretty darn accurate in this setting. FULL DISCLOSURE: I lost my old Flex diving on a reef off the coast of Great Abaco Island, so the measurements here are coming from a new device versus previous posts. Given the volume at which Fitbit is manufacturing, however, I will assume that any variation between the two devices is negligible.
The Raw Data for the 6 Position Trial is plotted below, along with the vector sum of the three axes. I’ve also labeled when the trials are occurring, to help them stand out from some of the other noise that’s in here (like me walking back to the other end of the sidewalk in order to repeat the exact same 50 steps):
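For anyone who wants to reproduce the vector sum, it's just the magnitude of the three-axis signal at each sample. Here's a minimal Python sketch; the sample values below are made up for illustration, since the raw Flex export isn't shown here:

```python
import numpy as np

# Hypothetical 3-axis accelerometer samples in units of g (columns: x, y, z).
raw = np.array([
    [ 0.02, -0.10, 0.99],
    [ 0.35,  0.05, 1.10],
    [-0.12,  0.20, 0.95],
])

# Vector sum of the three axes: sqrt(x^2 + y^2 + z^2). This collapses the
# three orientation-dependent channels into a single magnitude that hovers
# around 1G at rest, no matter how the wrist is oriented.
vector_sum = np.sqrt((raw ** 2).sum(axis=1))
print(vector_sum)
```

The payoff is that a step looks the same in this magnitude signal regardless of which of the six arm positions produced it.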
I feel pretty good about all of these except for trial B, when the 5-pack was behind my back. That was a really awkward position to maintain, and combined with the oscillating beers it makes for a pretty junk signal. I decided to throw it out, and plotted the vector sums for the other five trials below:
The next thing I wanted to do was revisit the methodology from an earlier post and figure out what the average step looked like for the vector sum of each of the 5 trials with the tracker in different orientations. Here are the results:
From this plot, we can learn that no matter what orientation the arm is in, a step looks pretty much the same when looking at the vector sum of all three axes. I then added the average of all 5 trials to the plot, and threw in some annotations which I believe will be useful in developing a new step counting algorithm:
As the plot shows, the features of all these signals that are the most similar are the period of the step and the slope of the signal between 0.9Gs < x < 1.1Gs (see yellow circles). I therefore believe it will be interesting to build an algorithm with the following criteria for what constitutes a step:
- Go from 0.9Gs to 1.1Gs in 1, 2 or 3 samples without any decrease in force (i.e. S1 < S2 < S3)
- Remain above 1.1Gs for at least 5 samples
- Go from 1.1Gs to 0.9Gs in 1, 2 or 3 samples without any increase in force (i.e. S1>S2>S3)
- Remain below 0.9Gs for at least 5 samples
If all of the above criteria are met, a step would then be recorded. Obviously this is a continuous cycle, where you could start counting at either position 1 or 3. I wrote a script ComplexCounter.m which approximated the above, although I wound up not coding in anything related to the slope because I found I often had no data points in that range due to a low sample rate. I adjusted the algorithm so that in order to count as a step, the signal had to remain above 1.1Gs or below 0.9Gs for two continuous samples, and generated these results:
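The adjusted counting rule is easy to sketch as a little state machine. ComplexCounter.m is in MATLAB and isn't reproduced here, so this is a rough Python approximation of the logic I described, not the actual script; the threshold and hold values match the post, but the synthetic signal is made up:

```python
HIGH, LOW = 1.1, 0.9  # thresholds in g from the average-step plot

def count_steps(signal, high=HIGH, low=LOW, hold=2):
    """Count steps in a vector-sum signal: one step per full cycle of
    staying above `high` for `hold` consecutive samples, then staying
    below `low` for `hold` consecutive samples."""
    steps = 0
    state = "below"  # assume we start in the low phase of the cycle
    run = 0          # consecutive samples satisfying the current test
    for g in signal:
        if state == "below":
            run = run + 1 if g > high else 0
            if run >= hold:
                state, run = "above", 0
        else:
            run = run + 1 if g < low else 0
            if run >= hold:
                state, run = "below", 0
                steps += 1  # completed one high/low cycle = one step
    return steps

# Synthetic example: the signal swings above 1.1G and back below 0.9G twice.
sig = [1.0, 1.2, 1.25, 1.0, 0.85, 0.8, 1.0, 1.15, 1.2, 0.88, 0.85, 1.0]
print(count_steps(sig))  # 2
```

Requiring two consecutive samples on each side of the band is what keeps a single noisy spike from registering as a step.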
The algorithm performed pretty well, even when back-tested against a much noisier signal from a previous post. Overall, I came away thinking that the right way to approach this problem is through the vector sum of the forces, since the signals look pretty much the same no matter how your wrist / arm are oriented. I also think there is something to the approach I’ve taken of measuring a minimum number of samples above a threshold, but I don’t believe my algorithm is sufficiently robust at present. I need to gather some more real-world data to back-test it with and potentially make some further tweaks.