Saturday, August 20, 2011

From Agile Development to "Alive Development"

Om Malik and others have recently blogged about the emergence of the "Alive Web" - which is all about live interaction with others, the real-time, the here-and-now. Chatroulette has been one of its representative examples.

People talk about how Google Plus also supports this trend, citing Google+ Hangouts as an example. I agree the Google+ launch has taken the Alive Web one giant leap forward - but for me the major "WOW" moment is not about any of its end-user features like Hangouts, Huddles, or the new Games. The most amazing leap, for me, is Google's "Alive" development and iteration of the Google+ product itself.

I have NEVER seen any company be as open and as interactive about the feature and UX design of a product at the scale of Google+. This is not just about how high-ranking Googlers are publicly active on the product, while some execs at other companies don't even use their own products (+Thomas Hawk mentioned that Carol Bartz doesn't have a Flickr account). I am talking about how, from day one, the Google+ team has been on the front lines, using the product itself to generate a live conversation with users: eliciting feedback through the feedback button, writing posts that ask for general or specific feedback, responding to users' posts with suggestions and/or complaints, and holding public hangouts for Q&A. Every few days we see another post and video with new features, directly responding to user comments and requests. Some recent examples by product manager +Shimrit Ben-Yair, designer +Jonathan Terlesky, and software wizard +Andy Hertzfeld: here, here, and here.
This is way beyond the traditional "agile development" methodologies. This is a near-realtime development cycle, with continuous user engagement, feedback, and response.
What we're seeing here is the Alive Development of the Alive Web.

Thursday, August 18, 2011

SXSW Talk Proposal

My SXSW plug:
I proposed a talk for the upcoming SXSW based on my PhD research and the work we are doing at the Human Dynamics group at the MIT Media Lab. I think it's going to be awesome, and I'm not biased at all.
Details below. Please vote!

Investigating Social Mechanisms with Mobile Phones
Imagine an imaging chamber placed around an entire community. What if we could, with permission, record and display nearly every facet of behavior, communication, and social interaction among its members as they live their everyday lives? This potential would afford rich insights into humanity - how societies operate, how real-world relationships form and change over time, and how behavior and choices spread from one person to another. We could diagnose the health of a community, and of its individuals. We could even measure the effects of feeding this information back to them.

At the MIT Media Lab, we have built the beginnings of what we call “The Social MRI.” You don’t need a huge chamber – just a bunch of modern smartphones. Using our mobile sensing software, we transformed a residential community into a living laboratory for over 15 months. Many signals were collected from each participant, altogether comprising what is, to date, the richest real-world dataset of its kind. As part of our continuing research, we are developing new tools to realize "the quantified self", and architectures to do all of this from a user-centric perspective – where individuals own their data, and privacy is embedded into the framework.

This talk will highlight surprising results from the study, introduce our open source tools developed for data collection, and discuss how the lessons learned could extend to improve the consumer and business worlds.

Questions The Talk Will Answer:
  • How can we design new mechanisms of social support (e.g. for increasing physical activity), and measure their performance with a real community?
  • How can mobile phones be used to infer real-world social signals, relationships, and other personal and group characteristics?
  • How is it possible to preserve user privacy while still enabling today’s data collection and advertising-driven business models?
  • Who has more influence over the mobile apps that you install on your phone – your friends on Facebook, your “real” friends, or the people you just hang out with?
  • What tools can we provide to developers and researchers to build apps for smart mobile sensing that are both secure and battery efficient?

Vote for My SXSW Idea!

Sunday, August 14, 2011

Step aside "Wearable Computing", "Epidermal Computing" is the new buzz!

"Epidermal Electronics" - Remember this term, because I am sure we'll be hearing more of it in the future. Ars Technica writes about this amazing new technology of  "Epidermal Electronic System" (EES). Basically, its a "technology that allows electrical measurements (and other measurements, such as temperature and strain) using ultra-thin polymers with embedded circuit elements. These devices connect to skin without adhesives, are practically unnoticeable, and can even be attached via temporary tattoo."

Check out the cool video by Northwestern:

Tattoo electronics could have medical applications from Northwestern News on Vimeo.

For an in-depth read and pictures that "show a lot of skin", I recommend diving into the full Science paper, and this Science perspective article that talks about the potential of the technology for medical applications.

What I thought was way cool is how they show proof of concept for solar and inductive power sources (read: wireless power-up, like RFIDs). I wonder if piezoelectrics could be used to power such epidermal devices from the motion of the wearer, or maybe harvest the person's body heat... (did anyone say Matrix?)

The authors discuss and show the feasibility of RF-based wireless communication, but I wonder if you could also do something like Body Area Networks, where multiple epidermal devices communicate with one another using the human body as the medium - so that one device could be responsible for aggregating the sensor data and transmitting it out. And if we go this far, why not person-to-person communication, à la Jay Silver's ok2touch, but entirely with epidermal computing:

How about epidermal peer-to-peer music sharing?

I can already picture Hallmark making a kiss-activated-epidermal-electronics-musical-greeting-card-tattoo for Valentine's Day (Hallmark, let's talk royalties. Call me).

And can you imagine how the TSA would react to this tech?

Do you have other ideas for EES applications?