Sensys'12 (day 1)
I am attending Sensys'12 in Toronto, which is a short distance from Buffalo. Sensys is a premier conference for wireless sensor networks and, more recently, smartphone sensing research. Here are my rough notes from the first day of the conference.
Energy-efficient GPS sensing with cloud offloading
This paper is really good; it blew me away. Later at the conference banquet Wednesday night, it was selected as the Best Paper. This work aims to open up the GPS black box and see how we can make it more energy-efficient. The paper goes into fairly technical detail about how GPS works, which I cannot replicate here. Some basics: GPS uses CDMA to transmit 50 bps of information over a carrier signal at 1.575 GHz, and seeing more satellites improves accuracy. GPS localization requires the following phases on the device: acquisition, tracking, decoding, code-phase, and least-squares calculation.

The insight of the paper is to offload some of these phases to the cloud, so the device does less work and spends less power. For example, the code-phase involves extracting the "ephemeris" from the signal to learn the locations of the satellites, but this lookup table is already made available in the cloud by NASA and the like. So that can be easily offloaded, and the device does not need to do it. Another trick is coarse-time navigation. If a nearby landmark has the same/close nms for the GPS signals (nms = number of milliseconds for the time of flight of the GPS signal), instead of calculating the location, use the landmark's location. If there is no such known location, the least-squares calculation may give several possible locations, but there is still hope: it is possible to use elevation information (the height dimension often ignored in the latitude-longitude pair) as a checksum, eliminating the spurious possibilities and pinning down the correct location. The elevation data and the computation power to do this are available on the cloud side.
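The elevation-checksum idea can be sketched in a few lines. This is my own illustrative reconstruction, not the paper's code: `terrain_elevation` stands in for a cloud-side digital elevation model, and the candidate fixes stand in for the ambiguous least-squares solutions.

```python
# Hypothetical sketch: among candidate fixes produced by the ambiguous
# least-squares step, keep the one whose altitude matches the terrain.

def plausible_fix(candidates, terrain_elevation, tolerance_m=100.0):
    """candidates: list of (lat, lon, alt_m) solutions.
    terrain_elevation: callable (lat, lon) -> ground elevation in meters
    (stands in for a cloud-side elevation database)."""
    matches = [c for c in candidates
               if abs(c[2] - terrain_elevation(c[0], c[1])) <= tolerance_m]
    return matches[0] if len(matches) == 1 else None  # unique match, or give up

# Toy usage: flat terrain at 150 m; only one candidate sits near the ground.
dem = lambda lat, lon: 150.0
cands = [(43.64, -79.39, 160.0),   # plausible: ~10 m above the terrain
         (12.00,  45.00, 9000.0)]  # spurious: ~9 km up in the air
print(plausible_fix(cands, dem))   # → (43.64, -79.39, 160.0)
```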
Using similar tricks, they reduce the duration of the acquisition phase on the device and the amount of information the device must gather from the satellites. They developed hardware for this (they call it the Co-GPS design). The Cleon prototype comprises a MAX2769 GPS receiver plus an MSP430 chip, and talks to the phone or device (iPod?) using audio communication through the headphone jack. Cleon ends up sampling GPS for only 2 ms (16 MHz sampling, 2 bits/sample, 2 ms), as that is enough for accurate localization (within 30 meters) via offloading to the cloud side. The awesome thing about Cleon is that it can run for 1.5 years on 2 AA batteries with continuous sampling. Wow! (This calculation does not take into account the communication cost to the cloud, as that is handled by the device Cleon connects to and can go over WiFi, Bluetooth, etc.) A byproduct of Cleon is atomic time stamping for outdoor sensors. They will open-source the software and the hardware soon.
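A quick back-of-envelope check, using only the numbers from the talk, shows how little raw data a 2 ms capture actually is, which is what makes shipping it to the cloud cheap:

```python
# Per-fix data volume for Cleon, from the talk's numbers:
# 16 MHz sampling, 2 bits/sample, 2 ms capture window.
sample_rate_hz = 16e6
bits_per_sample = 2
window_s = 2e-3

samples = sample_rate_hz * window_s          # 32,000 samples
payload_bits = samples * bits_per_sample     # 64,000 bits
print(payload_bits / 8 / 1000)               # → 8.0 (kB shipped to the cloud per fix)
```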
Indoor pseudo-ranging of mobile devices using ultrasonic chirps
This work presents an ultrasound-based smartphone ranging system. The transmitters are regular speakers, which abound in halls, malls, classrooms, and offices. The sound used sits just past the "normal" human hearing range; they utilize a 4 kHz band there. Localization is performed at the smartphone by calculating the time difference of arrival of the sound signals from predetermined speakers (sound is slow, which is an advantage for achieving fine-grained localization). This is not a new idea: MIT had the Cricket system that did this around 2005 for the motes. This work is a refined implementation of the same concepts; instead of a mote and special receiver hardware, they were able to pull it off with regular smartphone (iPhone) microphones.

Other noteworthy touches from the paper are as follows. They use chirp signals, because they found that chirps are resilient to errors and give better accuracy. When using such chirps, the human ear hears click sounds (demonstrated at the presentation), so they employ "psychoacoustic" (perception of sound) properties to solve this. The clicking sounds are generated by rapid frequency changes, so they reshape the chirps by adding fading (fading in and fading out) between them, making them inaudible to your ears. They also employ Hamming codes in the chirp signals for error detection and correction. As a result, they manage to get localization accuracy with less than 10 cm error. A disadvantage is that this method requires line of sight to the speakers, and the speakers need to be pre-localized and pre-arranged. Another nice touch: they tested this on their cat; the cat, although it could hear the sound, didn't seem to be disturbed by it. (They used a food-association technique to show this.) One more interesting tidbit from the talk: 2 people out of the 200-person crowd said they were able to hear the ultrasound even with the fading-in/out version. Some people have more sensitive ears, it turns out.
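The fade-in/fade-out trick can be sketched with a few lines of NumPy. This is my own illustration of the idea, not the authors' code; the sample rate, chirp band, and ramp length are assumptions chosen just to make the shape visible.

```python
import numpy as np

# Illustrative sketch (assumed parameters, not the paper's): a linear chirp
# whose edges are faded with a raised-cosine ramp, suppressing the audible
# "click" that an abrupt onset/offset would produce.
fs = 48_000                      # assumed sample rate (Hz)
dur = 0.05                      # 50 ms chirp (illustrative)
f0, f1 = 19_000, 23_000          # an assumed 4 kHz near-ultrasonic band
t = np.arange(int(fs * dur)) / fs
phase = 2 * np.pi * (f0 * t + (f1 - f0) * t**2 / (2 * dur))  # linear sweep
chirp = np.sin(phase)

ramp = int(0.005 * fs)           # 5 ms raised-cosine fade on each edge
window = np.ones_like(chirp)
window[:ramp] = 0.5 * (1 - np.cos(np.pi * np.arange(ramp) / ramp))
window[-ramp:] = window[:ramp][::-1]
faded = chirp * window           # amplitude starts and ends at exactly zero
```

Without the window, the waveform jumps discontinuously at the chirp boundaries, and that broadband transient is the click the audience heard in the demo.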
IODetector: a generic service for indoor/outdoor detection
This paper aims to provide a smartphone service that detects whether the phone is currently outdoors, semi-outdoors, or indoors. The application is to detect, automatically and in an energy-efficient manner, when the phone should not even bother activating the GPS for an app: if the phone is indoors or semi-outdoors, the GPS error will be large and will not justify the energy the GPS wastes. The service uses three modalities: the light signal, the cellular signal, and the magnetic field. Secondarily, acceleration, proximity, and time (to sanitize the light readings) are also employed. These are all very low-energy sensors. The light detector accounts for day versus night and can detect indoor fluorescent lighting. The cellular signal detector uses signals from multiple cell towers to avoid the corner-turning effect. However, no single sub-detector can completely classify the three types (indoor, outdoor, semi-outdoor), so they fuse the sub-detectors using the confidence values each one provides.
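The fusion step can be sketched minimally. This is an assumed additive-confidence scheme for illustration, not necessarily IODetector's actual fusion rule; the example confidence values are made up.

```python
# Minimal sketch (assumed, not IODetector's exact algorithm) of fusing
# sub-detectors that each report a per-class confidence in [0, 1].
STATES = ("indoor", "semi-outdoor", "outdoor")

def fuse(detections):
    """detections: one dict per sub-detector (light, cellular, magnetic),
    each mapping state -> confidence. Returns the best fused state."""
    totals = {s: 0.0 for s in STATES}
    for conf in detections:
        for s in STATES:
            totals[s] += conf.get(s, 0.0)   # simple additive fusion
    return max(totals, key=totals.get)

# Made-up readings: light strongly says indoor, cellular is ambiguous.
light    = {"indoor": 0.7, "semi-outdoor": 0.2, "outdoor": 0.1}
cellular = {"indoor": 0.4, "semi-outdoor": 0.4, "outdoor": 0.2}
magnetic = {"indoor": 0.6, "semi-outdoor": 0.3, "outdoor": 0.1}
print(fuse([light, cellular, magnetic]))    # → indoor
```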
MusicalHeart: A hearty way to listen to music
It is a proven fact that music has a profound effect on human heart rates. Some music calms; some music gets you worked up. This work presents a biofeedback-based, context-aware music recommendation app for smartphones, MusicalHeart. The app requires its own special sensor-equipped earphones to achieve the biofeedback part. More specifically, there are an IMU (inertial measurement unit) and a microphone embedded in the headphones. The microphone is used for heart-rate detection (which requires some filtering), and the IMU for activity detection (which employs k-means clustering). These sensors communicate with the smartphone app via the audio jack and are powered by a thin-film battery. In addition to the sensors in the headphones, the MusicalHeart app also uses the phone's GPS and WAP information for context detection.
MusicalHeart detects what type of activity you are doing (light L1, medium L2, high-level L3) and plays you the "appropriate" music to try to bring your heart rate to the suggested rate for that activity. The speaker claimed that there is a desired/suggested heart rate for each activity level, documented in the medical literature. (This is the first I am hearing of it.) So, MusicalHeart "custom"-selects a song to fill the gap between the desired heart rate for your activity and your currently detected heart rate: it may use a fast song to raise your heart rate or a slow song to lower it. Of course, things are not that simple. Three different features are considered for each song: tempo, pitch, and energy. An increase in energy raises your heart rate more effectively for L1 activities, while an increase in tempo raises it more effectively for L3 activities. The "custom" part of MusicalHeart is that the app learns over time what kind of music works better on you for the desired effects. The work was evaluated with an empirical study over 37 volunteers. I am still skeptical that any song can slow my heart rate at L3. I will try this at the demo Thursday (today). Yes, several of these papers also appear in the demo session, so the audience can try whether they work as advertised.
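The song-selection logic can be sketched as follows. This is a hypothetical toy model I wrote to illustrate the idea, not the paper's algorithm: the level-dependent tempo/energy weights and the per-song "delta" features are all made up, keeping only the qualitative claim from the talk (energy matters more at L1, tempo more at L3).

```python
# Hypothetical sketch in the spirit of MusicalHeart: pick the song whose
# predicted effect best fills the gap between current and target heart rate.
def pick_song(songs, current_hr, target_hr, level):
    gap = target_hr - current_hr            # positive: need to speed up
    # Made-up weights: energy dominates at L1, tempo dominates at L3.
    w_tempo, w_energy = {1: (0.3, 0.7), 2: (0.5, 0.5), 3: (0.7, 0.3)}[level]
    def predicted_effect(song):             # toy linear model, not the paper's
        return w_tempo * song["tempo_delta"] + w_energy * song["energy_delta"]
    return min(songs, key=lambda s: abs(predicted_effect(s) - gap))

songs = [
    {"title": "calm", "tempo_delta": -15, "energy_delta": -10},
    {"title": "hype", "tempo_delta": +20, "energy_delta": +15},
]
# Jogging (L3) at 90 bpm with a target of 120 bpm: pick the fast song.
print(pick_song(songs, current_hr=90, target_hr=120, level=3)["title"])  # → hype
```

Per-user learning would then amount to refitting the weights from the observed heart-rate response to each played song.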
The Banquet
The conference banquet was held at the CN Tower (the very tall, thin tower that is the symbol of Toronto). This banquet is up there for me with the Mobicom'11 banquet. The CN Tower was a very interesting and exciting venue for it, especially with the glass floor!