I use a Chrome extension called Video Speed Controller to set the speed of videos that I watch. When thinking about an “optimal” default setting, I realised this optimal level would ideally be dynamic.
Ideally, I would set this level algorithmically based on data. I could start with a function that outputs the speed level based on the time of day. Fairly simple: fastest in the morning, dropping off in the evening as the brain winds down for sleep.
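A minimal sketch of that first version might look like this. The breakpoints and speeds are illustrative guesses, not measured optima:

```python
from datetime import datetime

def speed_for_time(now: datetime) -> float:
    """Map the hour of day to a playback speed multiplier.

    All thresholds and speeds here are placeholder values,
    not derived from any real data.
    """
    hour = now.hour
    if 6 <= hour < 12:    # morning: sharpest
        return 2.0
    if 12 <= hour < 18:   # afternoon
        return 1.5
    if 18 <= hour < 22:   # evening: winding down
        return 1.2
    return 1.0            # night: plain 1x
```

So `speed_for_time(datetime(2020, 1, 1, 9))` would return 2.0 for a 9am viewing session.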
However, this function mapping inputs to video speed could use many more data inputs than the current time… data that is not just a general heuristic that “in theory” or “in general” represents my current optimal brain processing speed, but truly represents it.
There are many inputs from the world of quantified self that could inform this algorithm.
Location. I already track this, updating every minute, so it's near-enough real-time. I could define geobounded areas that act as multipliers on the speed: at work, 1.5x; at home, 0.9x; in public, 1x. One could even classify locations into categories with defined multipliers, so the scheme works for any location.
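A sketch of the geobounded-multiplier idea, assuming each zone is a circle around a centre point. The zone names, coordinates, radii, and multipliers are all made-up examples:

```python
import math

# Hypothetical zones: (name, latitude, longitude, radius_km, multiplier)
ZONES = [
    ("work", 51.5034, -0.1276, 0.3, 1.5),
    ("home", 51.5100, -0.1400, 0.2, 0.9),
]

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def location_multiplier(lat, lon):
    """Return the multiplier of the first zone containing the point,
    or 1.0 -- the 'in public' default -- if no zone matches."""
    for _name, zlat, zlon, radius_km, mult in ZONES:
        if haversine_km(lat, lon, zlat, zlon) <= radius_km:
            return mult
    return 1.0
```

Classifying arbitrary locations into categories would just mean populating `ZONES` from a places database instead of hand-written entries.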
Heart rate. I sort of have access to this with Fitbit. Some initial thoughts: I'm not sure how heart rate maps to mental processing speed, and I'm not sure I can get the data in real time enough to be meaningful. Its usage is less obvious to me at this point, but possibly high-yielding.
Actual neuronal activity. This is a thing: electroencephalography, or EEG for short. The issue is that right now you have to wire electrodes to your skull, even lubricating your head with gel to ensure optimal connection. Not quite convenient enough. Maybe one day technology will create something you can forget is there and that still measures neuronal activity meaningfully enough to derive insight, but we're not there yet.
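However these signals end up being sourced, they need to be combined into one speed. A simple approach, and purely an assumption on my part, is to treat each input as an independent multiplier on a base speed and clamp the result to a watchable range:

```python
def playback_speed(base_speed, multipliers):
    """Combine per-signal factors (time of day, location, heart rate, ...)
    multiplicatively, then clamp to a watchable range.

    Multiplicative combination and the 0.5x-3.0x bounds are
    illustrative choices, not established values.
    """
    speed = base_speed
    for m in multipliers:
        speed *= m
    return max(0.5, min(speed, 3.0))
```

For example, `playback_speed(1.0, [1.5, 0.9])` combines a morning factor with an at-home factor to give 1.35x.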
What other variables could be set based on quantified data about the self? Video playback speed is a good one, but what else? I’ll throw some ideas out but would love to hear other people’s ideas. Tweet me or comment!
Temperature of your house? Nest somewhat learns user temperature preferences with reinforcement learning, but could it determine an optimal temperature with only quantified self data?
Brightness and colour temperature (colour profile, not heat) of your screens, to allow more natural and healthy circadian rhythm regulation. F.lux lets you set up a schedule of screen temperature using fixed user preferences, the time, and sunrise and sunset as inputs. What if we could control this better? Maybe a rule that won't lower the temperature unless you are at home? Maybe using ambient lighting levels from IoT devices?
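That at-home rule could be sketched as follows. The schedule boundaries, kelvin values, and lux threshold are all hypothetical placeholders, not anything F.lux actually does:

```python
def screen_temperature_kelvin(hour, at_home, ambient_lux=None):
    """Pick a screen colour temperature in kelvin.

    Hypothetical rule: follow a day/night schedule, but only drop
    to the warm night setting when at home, and go warmer still if
    IoT sensors report the room is already dim.
    """
    DAY, NIGHT, DIM = 6500, 2700, 2300  # illustrative values
    is_night = hour < 7 or hour >= 20
    if not is_night:
        return DAY
    if not at_home:
        return DAY   # don't warm the screen away from home
    if ambient_lux is not None and ambient_lux < 10:
        return DIM   # very dim room: warmer again
    return NIGHT
```

At 10pm this returns 6500 K in a café but 2700 K at home, and 2300 K at home with the lights already low.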
I’m sure there are a lot of possibilities enabled by the combination of real-time quantified self data. It feels potentially powerful, and something that has only been enabled with recent developments in technology. Mobile phones and IoT devices for data sources. Big data architectures for processing. Cloud computing for flexible configuration.
What could we optimise about the human condition?