In February 2004 I was invited by Glenn Edens to join Sun Labs as part of a new effort there to explore sensor networks. Initially, Randy Smith and I explored this by getting acquainted with the Crossbow Motes. We learned how to program these small computers and built some sample applications. Some were purely whimsical, like measuring the rotation and wobble of a frisbee with the two-axis MEMS accelerometers on the Motes. Others were more explorative, such as a smart doorway equipped with a web camera, a Mote-controlled electronic lock, and a Mote-controlled doorbell-and-switch pair. A person wearing a properly keyed Mote who pushed the doorbell would have the door simply open for them, based on the contiguity (radio signal strength) of the wrist-worn Mote and the Mote-controlled switch, and on the invocation of the bell push. On the other hand, if someone without such a Mote pressed the doorbell, the bell would ring; the web camera would snap a time-stamped picture and push it out to a website I built; and a server programmed by Randy would ring my PDA smartphone with a recorded message telling me the door needed attention. I could then bring up the web browser on the phone, look at the picture to see who was standing there, and either press a button to unlock the door remotely or refresh the picture. It made for a cool demo at the Sun Labs Open House that year.

Following this initial exploration, Glenn decided that we should build our own superior platform for sensor networks. An important strategic goal was for it to promote the Java programming language in this new arena, both in education and in engineering. Roger Meike directed the new group. Bob Alkire, the way-smart hardware designer who had designed the hardware for the Kava Project that he and I both worked on at AT&T Labs, started to design what became the Sun SPOT platform. A team that had been working on a variant of Java called Squawk merged with our team to do the system programming. Randy and I, along with others, worked on applications, UI, and system ideas.

Working with Bob and others, our team designed a much more capable platform than the Motes. We put in a full 32-bit ARM9 processor with a fast clock, instead of the wimpy 8-bit processor the Crossbow Motes used, so we could do things like FFTs. The radio was an 802.15.4 multi-channel transceiver. The design ran Java "bare on the metal", without an OS, an approach that let us know when all the threads were asleep and power the device down to a deep-sleep mode that consumed only 35 microamps, under the control of a tiny satellite processor with a good clock. We put together what came to be called the "kitchen sink" demo sensor board, with a three-axis accelerometer rather than the Mote's two-axis device, a large number of GPIO pins, high-current drivers, light and temperature sensors, and a bank of eight 24-bit RGB LEDs. We spent extra effort making it all small enough to be worn on a wrist, to explore gesture UIs. The idea was to build a capability space with so many orthogonal axes that people would be able to think of intents based on those capabilities that we had not thought of.
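To make the bare-metal power story concrete, here is a minimal Java sketch of the kind of decision the runtime could make because it owned the thread scheduler: when every thread is parked in a timed wait, hand control to the satellite processor until the earliest wakeup. The class and method names (DeepSleepScheduler, PowerController, deepSleepUntil, and so on) are hypothetical placeholders for illustration, not the actual Squawk or Sun SPOT APIs.

    // Hypothetical sketch: because the VM owns scheduling, it can tell when every
    // thread is blocked with a known wakeup time and drop into deep sleep.
    import java.util.List;

    final class DeepSleepScheduler {

        interface SleepingThread {
            boolean isAsleep();               // parked in a timed wait
            long wakeupTimeMillis();          // absolute time it next needs the CPU
        }

        interface PowerController {           // stands in for the satellite processor
            void deepSleepUntil(long wakeupMillis);   // the ~35 microamp state
        }

        private final List<SleepingThread> threads;
        private final PowerController power;

        DeepSleepScheduler(List<SleepingThread> threads, PowerController power) {
            this.threads = threads;
            this.power = power;
        }

        // Called when the run queue is empty.
        void onIdle(long nowMillis) {
            long earliestWakeup = Long.MAX_VALUE;
            for (SleepingThread t : threads) {
                if (!t.isAsleep()) {
                    return;                   // something is runnable; stay awake
                }
                earliestWakeup = Math.min(earliestWakeup, t.wakeupTimeMillis());
            }
            if (earliestWakeup != Long.MAX_VALUE && earliestWakeup > nowMillis) {
                power.deepSleepUntil(earliestWakeup);   // satellite clock wakes us up
            }
        }
    }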
Later, this was amply rewarded by the results of a sponsored term at the Art Center College of Design titled The New Ecology of Things, where grad students amazed us with artistic and practical realizations that ranged from augmented reality and smart dorms, through gesture-based multi-person music and dance environments, to SPOT-powered swarming blimps that exhibited emergent behavior. Meanwhile, I continued to explore ideas for how to use such small wireless, wearable devices. With my colleagues Randy Smith, John Nolan, Jim Hughes, and Ron Goldman I invented or co-invented six ideas that were approved for patent filing; five of these had been filed as U.S. patent applications as of September 2006. They included an improved RFID transponder whose response decayed over time; a swarm-intelligence approach to detecting theft of packages; a gesture-based provisioning system for smart-device content; the idea of spatially stationary software on mobile devices; a secure, tamper-respondent crypto-key device based on gesture recognition; and an emergent scheme for allocating time slices on a wireless network.
A fun part of working on the SPOT project was designing the "out-of-the-box experience". When Randy and I initially worked with the Crossbow Motes we were frustrated by the bring-up process: once out of the box, nothing happened until after two days of downloading and installing various packages from several vendors. We wanted something fun to happen within minutes, and explorative programming to follow quickly, based on small programs that demonstrated capabilities like using the accelerometers for gestures. A committee of people with ideas for the out-of-the-box experience formed, and it quickly became apparent that there were lots of ideas and poor communication. I asked Angel Lin, an Art Center intern working at Sun Labs in Brenda Laurel's Star Fleet group, to come help us by making a storyboard of the out-of-the-box experience ideas. Exteriorizing the ideas with the storyboard was extremely useful, and we all settled on an experience focused on a short sequence of unpacking, installing, running initial software, and quickly exploring by modifying sample demo programs. We decided to put a simplified version of Angel's storyboard right inside the box as the first thing people would see when unpacking, to guide their experience. We also put a "blueprint" of the Sun SPOT, with all the dimensions of the plastics, boards, etc., on the outside of the inner box, so that a user could unfold and photocopy the box to get dimensional information.

For me, a big part of the fun was thinking about what could be done with the affordances Sun SPOTs provided: gesture interfaces, swarm intelligence... reifying animism and "magic" as design principles. One of my patent applications lists the ancient Celtic making and unmaking circular movements as exemplars of gestures used to provision software in smart devices... followed by the phrase "it will be apparent to one skilled in the Art..." The patent attorneys thought that was a hoot! More seriously, I explored how to make animism work, building on the ideas expressed in my emotional robot project at Interval Research: putting perception-representation-action loops inside objects... making them appear to come alive within an internet of things. The idea of making "magical causality" actually function within human narrative intelligence also intrigued me, and I developed a theory of it based on four "principles of magic": Contiguity (e.g., radio signal strength), Contagion (passing software from one device to another), Similarity (spooky action at a distance), and Invocation (gestures, such as a wrist movement, or a button press). Conceived in this manner, a TV remote control is a magic wand... and it takes two of the four principles working together at any time to make a robust UI. Randy and I called the demos built around this, collectively, "the ectoplasmic interface", and filed several patents under that rubric. You can find out more about Sun SPOTs at http://www.sunspotworld.com
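As a small illustration of that two-principle rule, here is a hedged Java sketch, using hypothetical names rather than any actual Sun SPOT or "ectoplasmic interface" code, that gates an action on at least two of the four principles holding at once. The doorway demo maps onto it directly: Contiguity is the radio signal strength of the keyed Mote, and Invocation is the bell push; the signal-strength threshold below is an assumed value.

    import java.util.EnumSet;
    import java.util.Set;

    // Hypothetical sketch of the "two of four principles" rule; the names and the
    // RSSI threshold are illustrative assumptions, not Sun SPOT APIs or values.
    final class MagicRule {

        enum Principle { CONTIGUITY, CONTAGION, SIMILARITY, INVOCATION }

        // A robust "magical" action requires at least two principles at once.
        static boolean shouldTrigger(Set<Principle> satisfied) {
            return satisfied.size() >= 2;
        }

        // Doorway demo: contiguity is radio signal strength, invocation is the bell push.
        static boolean shouldUnlockDoor(int rssiDbm, boolean bellPressed) {
            final int nearbyThresholdDbm = -60;              // assumed "nearby" cutoff
            Set<Principle> satisfied = EnumSet.noneOf(Principle.class);
            if (rssiDbm > nearbyThresholdDbm) satisfied.add(Principle.CONTIGUITY);
            if (bellPressed)                  satisfied.add(Principle.INVOCATION);
            return shouldTrigger(satisfied);
        }

        public static void main(String[] args) {
            // Keyed Mote close to the door plus a bell press: open it.
            System.out.println(shouldUnlockDoor(-45, true));   // true
            // Bell press alone (no keyed Mote nearby): ring and snap a photo instead.
            System.out.println(shouldUnlockDoor(-90, true));   // false
        }
    }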