Smartphones aren’t yet that smart (at least if you asked Charles Darwin)

Darwinism basically concludes that if smartphones were people, they’d be dead. Of course I love these brilliant little devices. Who wouldn’t be amazed to walk around holding in their pocket a computer that just a few decades ago would have weighed several tons to match its processing power? But the reality is that smartphones aren’t yet that smart, because human intelligence isn’t just about your capabilities in a vacuum but rather your adaptability to your environment. Right now the best my smartphone can figure out is time and place: where am I, and when is it? But that’s not enough. Imagine if it could answer these other questions and make the answers dynamically available to developers.

“What am I doing?”
Whether via the accelerometer or just by calculating change in position via GPS, my phone can start to draw conclusions about what I am doing. For example, if it is moving quickly and also plugged in to charge, it should know that it is likely in a car. Why is this interesting? Because I would love many apps to default to a ‘car mode’ which could, for example, give me a simpler screen UI so I can hit the one or two most important buttons for the app while I am driving.
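The car-mode guess above can be sketched in a few lines. This is a minimal, hypothetical heuristic, not any platform's actual API; the function name and the ~7 m/s threshold (faster than a brisk run) are my own illustrative choices.

```python
# Hypothetical sketch: guess "car mode" from the two signals the post
# mentions. Real platforms expose speed and charging state through
# their own sensor / power APIs; names here are made up.

def likely_in_car(speed_mps: float, is_charging: bool) -> bool:
    """Guess the phone is in a car: moving faster than a brisk run
    (~7 m/s) while plugged in to charge."""
    return speed_mps > 7.0 and is_charging

print(likely_in_car(20.0, True))   # highway speed, on the charger
print(likely_in_car(1.2, False))   # walking pace, on battery
```

An app could check this guess on launch and default to the simplified one-or-two-button UI when it returns True.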

By just using speed, stability, local time, and pattern recognition, my phone should be able to make educated guesses as to whether I am awake or asleep, still or walking or running, etc. Tie this into my calendar and you really start to have some interesting possibilities for determining state.
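As a thought experiment, here is what that educated guess might look like combining speed, stability, and local time. All the thresholds are illustrative assumptions of mine, not measured values.

```python
# Hypothetical heuristic for awake/asleep/still/walking/running,
# combining the signals the post lists. Thresholds are rough guesses.

def guess_activity(speed_mps: float, stationary_minutes: float,
                   local_hour: int) -> str:
    """Classify the user's likely state from simple sensor summaries."""
    # Long stillness during overnight hours suggests sleep.
    if stationary_minutes > 30 and (local_hour >= 23 or local_hour < 6):
        return "asleep"
    if speed_mps < 0.2:
        return "still"
    if speed_mps < 2.5:   # typical walking pace tops out around 2 m/s
        return "walking"
    return "running"

print(guess_activity(0.0, 120, 2))   # long stillness in the small hours
print(guess_activity(1.4, 0, 14))    # afternoon stroll
```

Cross-referencing the result against calendar entries (a 2 p.m. “meeting” plus a “still” reading is a stronger signal than either alone) is where the state detection gets interesting.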

“How healthy is the phone?”
Can apps access battery status? I don’t think so, and I have never understood why my phone doesn’t make dynamic power management choices when the battery is low. For example, it could decrease the refresh rate of apps running in the background.
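A dynamic power policy like the one suggested could be as simple as scaling the background refresh interval by battery level. The cutoffs and multipliers below are my own illustration, not any operating system's actual behavior.

```python
# Hypothetical dynamic power management sketch: stretch the background
# refresh interval as the battery drains. Rules are illustrative.

def refresh_interval_s(base_interval_s: float, battery_pct: float) -> float:
    """Below 20% battery, double the refresh interval; below 10%,
    quadruple it; otherwise leave it alone."""
    if battery_pct < 10:
        return base_interval_s * 4
    if battery_pct < 20:
        return base_interval_s * 2
    return base_interval_s

print(refresh_interval_s(60, 80))  # plenty of battery: unchanged
print(refresh_interval_s(60, 15))  # low battery: refresh half as often
```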

“Have I been here before?”
My phone certainly should know whether it is entering a geography it has visited before. Would this data be interesting to developers? I’m fairly certain all the mobile-local-social apps would like to know whether I was a first-time visitor to a neighborhood or a regular. It would also be very cool to have a map view of my city color-coded by the density of my previous visits – sort of a heat map for where I’ve been.
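One way to sketch this: snap GPS fixes to coarse grid cells and count visits per cell. A first-time visit is a cell with no prior count, and the counts are exactly the density data a heat-map view would need. The cell size and function names are illustrative assumptions.

```python
# Hypothetical "have I been here before" sketch: bucket coordinates
# into a coarse grid and count visits per cell.

from collections import Counter

def cell(lat: float, lon: float, size_deg: float = 0.01) -> tuple:
    """Snap a coordinate to a grid cell roughly 1 km across."""
    return (round(lat / size_deg), round(lon / size_deg))

visits = Counter()  # cell -> visit count; this is the heat-map data

def record_visit(lat: float, lon: float) -> bool:
    """Record a fix; return True if this cell is new territory."""
    c = cell(lat, lon)
    first_time = visits[c] == 0
    visits[c] += 1
    return first_time

print(record_visit(40.7128, -74.0060))  # first trip to this block
print(record_visit(40.7129, -74.0061))  # nearby fix lands in the same cell
```

A mobile-local-social app could branch on the boolean (first-timer vs. regular), while the map view would color each cell by its count.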

“Do I recognize the people around me?”
This one is probably the most controversial, but what if phones could sniff one another’s device IDs? From a biological perspective, species definitely evolve a heightened ability to detect strangers in the pack. What would the device-driven equivalent be?
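Purely as a thought experiment (real platforms deliberately restrict access to nearby device identifiers), the stranger-detection idea reduces to remembering which IDs have been seen before. Everything here, including the ID format, is hypothetical.

```python
# Hypothetical "strangers in the pack" sketch: remember device IDs
# seen before and flag newcomers in each scan.

seen_ids: set = set()

def classify_nearby(device_ids: list) -> dict:
    """Split the current scan into familiar devices and strangers,
    then remember everyone for next time."""
    familiar = [d for d in device_ids if d in seen_ids]
    strangers = [d for d in device_ids if d not in seen_ids]
    seen_ids.update(device_ids)
    return {"familiar": familiar, "strangers": strangers}

print(classify_nearby(["aa:01", "bb:02"]))  # everyone is a stranger at first
print(classify_nearby(["aa:01", "cc:03"]))  # aa:01 is now familiar
```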

Just some basic thoughts – what are other ideas for how this data could be used productively (obviously there are a ton of abuse vectors)? What other ways could devices better mimic human intelligence and awareness?

2 thoughts on “Smartphones aren’t yet that smart (at least if you asked Charles Darwin)”

  1. this is all fascinating. beyond just isolated sensor identification, though, i get most excited about the social and pattern-based actions. for instance, if you usually go for a coffee every day between lunch and dinner, can it recommend places if you happen to be in a new area? can that recommendation be social (based on friends' activities)? can my phone figure out what i want to do, no matter where i am and what time it happens to be?

    a lot of amazing things can be done with sensors, patterns can be established from historical data, and a lot of depth can come from location data with layered-on metadata. the overlap of the three is where things get super interesting, at least to me.
