An example of the type of technology-driven personalized content I've been talking about lately. But, as I've mentioned many times before, this tech could be used for so much more than selling beer to women.
Yesterday, I gave a talk to a group of Indianapolis UX professionals and students at our local Indy UX Salon. I talked about where I hope technology is headed and how my vision might affect User Interface Design. The slide deck you see here won't make much sense if you weren't at the talk, but I wanted to make it available nonetheless.
The gist of my presentation was this: as a UX/UI Designer, I want to know as much about my user as possible. You can think of this as a sort of extreme version of Responsive Design. In my vision for our future, every action we take is observed and recorded by the environment around us. And I mean every action. Sensors are getting smarter and better, so we can know things like stress level, precise location, what we see, and how we react to it verbally or physiologically. That massive set of data can then be analyzed, interpreted, and in turn used to make the UIs around us hyper-personalized.
Imagine the power of a screen knowing its viewer is dyslexic and being able to reformat all text to Dyslexie. Or, imagine being able to explore a new city without constantly staring down at your phone for information. If the environment senses we are nervous or lost, it can immediately surface directions to us (because it also knows where we are going). There are a lot of concerns about this new world (privacy, visual overload, abuse by advertisers, etc.), but I want to start thinking about it from a positive perspective now and see if we can't get a head start on designing more useful interactions.
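To make that sense-then-adapt loop concrete, here's a toy sketch of the idea in Python. Everything in it is hypothetical (the context keys, the threshold, the adaptation names); a real system would be fusing noisy sensor streams, not reading clean values from a dictionary.

```python
# A toy sketch of context-driven UI adaptation. All names and values
# here are hypothetical illustrations, not a real sensing API.

def adapt_ui(context):
    """Map sensed user state to a list of UI adaptations."""
    adaptations = []
    if context.get("dyslexic"):
        # The screen knows its viewer is dyslexic: reformat text
        # in a dyslexia-friendly typeface.
        adaptations.append("set_typeface:Dyslexie")
    if context.get("stress_level", 0) > 0.7 and context.get("destination"):
        # The user seems nervous or lost, and the system knows where
        # they're going: surface directions proactively.
        adaptations.append("show_directions:" + context["destination"])
    if not adaptations:
        # Default: stay out of the way. The less direct input, the better.
        adaptations.append("no_change")
    return adaptations

print(adapt_ui({"dyslexic": True}))
print(adapt_ui({"stress_level": 0.9, "destination": "hotel"}))
```

The interesting design question isn't the rules themselves but the default branch: the system should do nothing visible unless the sensed context clearly warrants it.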
The message I gave last night was this: I want to live in a world where our technology is not constantly demanding our attention. I want to get to the point where we can just act naturally and our environment reacts accordingly. The less direct user input the better; the system should anticipate what we want. I think we're closer to this world than you might think, and I'm growing tired of living life through the demanding metal and glass rectangles that surround us. I don't have many answers yet. I'm just trying to start the conversation.
As someone who lives with asthma and has taken asthma medication since before I can remember, I'm not sure the people at Amiko are solving a real problem here. Regardless, I'm interested to see where it goes, and I wish they supported my inhaler. I've always wondered if I'm holding my inhaler correctly to get the right dosage of medicine. It seems like this could solve that problem. However, I've never found it difficult to remember to take my inhaler.
As always, I'm glad to see technology unobtrusively integrated into our lives to make them slightly better, although some of the recent onslaught of this type of product feels a little awkward. I'm ready for the day when this type of tech is just integrated into the inhaler itself—no add-on needed.
Technology for the masses like this also brings up a bigger issue. Notice that you have the option to purchase an Amiko Hub if you don't have a smartphone to sync the tracker with. Imagine how untenable our homes will get if we have a different hub for each "smart" device we own. We still have a long way to go before the infrastructure of our world is ready to accommodate an all-encompassing smart environment.
At Midwest UX 14 last weekend, it was really good to see so many people talking about ubiquitous computing/IoT topics. One really nice takeaway from those discussions was learning about Estimote. They make iBeacon sensors and provide dev kits for making simple apps. Their latest product is a set of 10 small stickers that act as sensors, accelerometers, and thermometers. I'm looking forward to getting my hands on these to try out in class.
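Beacons like these work by broadcasting a Bluetooth signal whose received strength (RSSI) an app can turn into a rough distance estimate. A common approximation is the log-distance path-loss model, sketched here in Python; the calibration constants are illustrative, not Estimote's actual values.

```python
def estimate_distance(rssi, measured_power=-59, path_loss_exponent=2.0):
    """Rough distance in meters from a beacon RSSI reading, using the
    log-distance path-loss model.

    measured_power is the expected RSSI at 1 meter (a per-beacon
    calibration value); path_loss_exponent models the environment
    (roughly 2.0 in free space, higher indoors). Both defaults here
    are illustrative assumptions.
    """
    return 10 ** ((measured_power - rssi) / (10 * path_loss_exponent))

# At the calibration point, the estimate is 1 meter by definition.
print(round(estimate_distance(-59), 1))  # → 1.0
# A much weaker reading suggests the beacon is farther away.
print(round(estimate_distance(-79), 1))  # → 10.0
```

In practice RSSI is noisy, so apps usually smooth readings over time and report coarse proximity zones (immediate/near/far) rather than exact distances.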
We get closer and closer to Minority Report every day. This is a silly example of how you could use facial recognition technology, but it's nice to see that it's becoming accessible enough to pull off stunts like this.
The second video in Microsoft's Connecting Series (in which they apparently release a video only every 2 years??). It's not really about makers. I'm not a fan of the over-the-top feeling of the video (they make these interactions seem almost spiritual), but I'm glad Microsoft has enough money and reach to talk about the future. I think it's naive for companies to release spec videos, but in this case they seem to be doing it to serve the community rather than themselves. At least, I hope so.
Pew has been putting out some great reports on the future of digital life and IoT. There is definitely a scary side to the IoT. I can only imagine the reactions to statements like this:
We need to find ways to tell better stories about the IoT. Because most of the interactions happen machine-to-machine, people only need a concept of how their lives can be impacted.
The biggest takeaway for me from this report is that computers will disappear in the near future. I can't wait for this. That is the most exciting thing as a UI designer: the "user interface" will come to mean the "visible world," not just the screen. One funny thing about wearable computers:
Have wearable computers ever been cool? Ever seen someone wearing Glass? They're not cool. Find the full report here.