Today was mostly spent on data analysis, along with some work on the app. On the app front, I realized that the raw accelerometer data we are collecting isn't quite what we are looking for: what we really want is how that accelerometer data correlates to a concrete physical measure, in our case the user's characters per second, or cps. I calculated the cps and added an option, toggled by a switchable boolean, to pass either this data or the original raw accelerometer data to the data analyzer. I do think cps is the way to go, though, as it is a more concrete value.
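To make the cps calculation and the toggle concrete, here is a minimal sketch of what I mean. All the names here (`KeyEvent`, `USE_CPS`, `payload_for_analyzer`) are made up for illustration, not the actual app code:

```python
from dataclasses import dataclass

@dataclass
class KeyEvent:
    timestamp: float    # seconds since the start of the text
    chars_typed: int    # cumulative characters entered so far

def chars_per_second(events):
    """Compute cps over each interval between consecutive key events."""
    cps = []
    for prev, curr in zip(events, events[1:]):
        dt = curr.timestamp - prev.timestamp
        if dt > 0:
            cps.append((curr.chars_typed - prev.chars_typed) / dt)
    return cps

# The switchable boolean: send cps, or fall back to raw accelerometer data.
USE_CPS = True

def payload_for_analyzer(events, raw_accel):
    return chars_per_second(events) if USE_CPS else raw_accel
```

So with `USE_CPS` flipped off, the analyzer would receive the raw accelerometer stream unchanged, which keeps the old behavior available for comparison.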
I then went back to working on the data analyzer and decided that I wanted to just redo it, because I realized that our goal with the data analyzer had been fundamentally flawed. We had been creating an analyzer for a single ID, thinking we would eventually update it to “talk” with other users’ data, but in our current implementation that would be very difficult. Because of this, I plan to keep the data focused on a single user, and instead of creating tons of metrics for metrics’ sake, I will create only the metrics that make sense within the scope of our project, especially since we don’t have much time left: we will be running our study next week. Basically, the biggest thing we want to find from each text is an average z-score for each word, averaged across all users, indicating how much the cps at that word deviates from that user’s average cps across the whole text. This will give a numerical indication of each word’s difficulty, along with an associated spread from which we can estimate the certainty of our results. I don’t think it will actually be that difficult to implement, but I will work on it tomorrow. One thing of note: I have decided to disregard negative cps values in the per-word cps, because I think the negative cps constitutes part of the second cps instance’s whole. I will make a metric that compares first viewing, second viewing, and average viewing against the cps, but not negative viewings.
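The per-word z-score idea above can be sketched roughly like this: for each user, standardize each word's cps against that user's own mean and standard deviation over the whole text, then average the z-scores across users (the spread of those z-scores giving a rough certainty measure). This is just my working sketch, assuming cps comes in as one list per user with one value per word; negative cps values are dropped, as decided above:

```python
from statistics import mean, pstdev

def word_z_scores(per_user_word_cps):
    """per_user_word_cps maps user_id -> list of cps values, one per word.
    Returns {word_index: (avg_z, spread)} averaged across users.
    Negative cps values are disregarded."""
    z_by_word = {}
    for user, word_cps in per_user_word_cps.items():
        valid = [c for c in word_cps if c >= 0]  # drop negative cps
        mu, sigma = mean(valid), pstdev(valid)
        if sigma == 0:
            continue  # user typed at a perfectly uniform cps; no signal
        for i, c in enumerate(word_cps):
            if c < 0:
                continue
            z_by_word.setdefault(i, []).append((c - mu) / sigma)
    # average z across users, plus the spread of those z-scores
    return {i: (mean(zs), pstdev(zs)) for i, zs in z_by_word.items()}
```

A positive average z at a word would mean users slowed down there relative to their own baseline (more time per character, depending on how cps is oriented), which is the per-word difficulty signal we're after.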
Oh, and finally, I want to implement a noise filter on the accelerometer data, as it is very noisy, but I don’t understand any of the filters I looked up (Kalman filter, what are you?!!?!), so I will try to figure that out tomorrow as well.
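As a possible stopgap before tackling the Kalman filter, a plain moving average is a far simpler smoother that might be good enough for noisy accelerometer samples. This is only a sketch of that simpler alternative, not the filter the project will necessarily end up using:

```python
def moving_average(samples, window=5):
    """Simple box filter: each output value is the mean of the last
    `window` samples (fewer at the start, before the window fills).
    A much simpler starting point than a Kalman filter for smoothing
    noisy accelerometer readings, at the cost of added lag."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)
        out.append(sum(samples[lo:i + 1]) / (i - lo + 1))
    return out
```

The main trade-off is lag: a larger `window` smooths more but makes the filtered signal respond more slowly to real changes, which matters if the cps correlation depends on timing.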