[Image: Gary Weiss]

Did you ever wonder what your smartphone knows about you? Or how it learns about you? Wouldn't it be great if it could tell you things that you don't even recognize about how you walk, talk and act?

Smartphones are already capable of doing this, and many researchers are dedicated to finding ways to gather and interpret the most useful information. Modern smartphones are packed with many powerful sensors that enable the phone to collect data about you. Although that may alarm anyone who is concerned about privacy, the sensors also present an opportunity to help smartphone users in previously impossible ways. When Gary Weiss realized how much these sensors could tell about a person, he established the Wireless Sensor Data Mining (WISDM) Lab at Fordham Univ. in the Bronx, N.Y. The goal of this lab is to apply modern machine learning and data mining methods in order to "mine" knowledge about smartphone users from their sensor data. 

Smartphones contain more sensors than most people would ever imagine. Android phones and iPhones include an audio sensor (microphone), image sensor (camera), touch sensor (screen), acceleration sensor (tri-axial accelerometer), light sensor, proximity sensor, and several sensors (including the Global Positioning System) for establishing location.

Early on, the researchers decided to focus their efforts on the tri-axial accelerometer, since they felt it was one of the most informative – and underutilized – sensors. This sensor measures the phone's acceleration along all three spatial axes, from which the phone's orientation can also be inferred. This enables the phone to adjust the screen display in response to changes in orientation, while also supporting advanced motion-based game play.
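To give a sense of what a phone can compute from this sensor, the sketch below reduces one hypothetical window of tri-axial readings to per-axis statistics and an average acceleration magnitude. The sample values and feature names are illustrative assumptions, not the lab's actual pipeline.

```python
import math

# One "window" of tri-axial accelerometer samples (x, y, z in m/s^2).
# These values are made-up stand-ins for real sensor readings.
window = [
    (0.1, 9.7, 0.3),
    (1.2, 8.9, -0.4),
    (-0.8, 10.5, 0.2),
    (0.4, 9.6, -0.1),
]

def window_features(samples):
    """Summarize a window of (x, y, z) readings as simple statistics."""
    n = len(samples)
    feats = {}
    for i, axis in enumerate("xyz"):
        vals = [s[i] for s in samples]
        mean = sum(vals) / n
        var = sum((v - mean) ** 2 for v in vals) / n
        feats[f"mean_{axis}"] = mean
        feats[f"std_{axis}"] = math.sqrt(var)
    # Overall magnitude of acceleration, averaged over the window.
    # Near-rest windows hover around gravity (~9.8 m/s^2).
    feats["mean_magnitude"] = sum(
        math.sqrt(x * x + y * y + z * z) for x, y, z in samples
    ) / n
    return feats

print(window_features(window))
```

Statistics like these, rather than the raw readings, are the kind of compact summary a learning algorithm can work with.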

Their first goal was to use the accelerometer to perform activity recognition – to identify the physical activity, such as walking, that a smartphone user is performing. They believed that this ability could then be used as the basis for many health and fitness applications, and could also be used to make the smartphone more context-sensitive, so that its behavior would take into account what the user is doing. The phone could then, for example, automatically send phone calls to voice mail if the user was jogging.

They used existing classification algorithms to map accelerometer data to activities such as walking. These algorithms, or methods, learn from labeled examples. Given data about U.S. football players and non-football players, such an algorithm might learn that football players tend to weigh over 200 lbs. In this case, the researchers provide the algorithm with acceleration data labeled with the associated activity, and from this data the algorithm automatically generates rules for identifying the activities. Since these rules can be implemented in software, the activity recognition process can be fully automated.
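The football example above can be made concrete with the simplest possible learned rule: a single threshold. The sketch below tries candidate cut points between sorted values and keeps the most accurate one; the toy data and the stump learner are illustrative assumptions, not the WISDM lab's actual algorithms.

```python
def learn_stump(values, labels):
    """Learn a one-threshold rule: predict True when value > threshold.
    Tries midpoints between consecutive sorted values and keeps the
    threshold with the highest training accuracy."""
    pairs = sorted(zip(values, labels))
    best = (None, -1.0)  # (threshold, accuracy)
    candidates = [
        (pairs[i][0] + pairs[i + 1][0]) / 2 for i in range(len(pairs) - 1)
    ]
    for t in candidates:
        correct = sum((v > t) == lab for v, lab in pairs)
        acc = correct / len(pairs)
        if acc > best[1]:
            best = (t, acc)
    return best

# Toy data: weights in lbs, True = football player
weights = [160, 175, 190, 210, 240, 260]
is_player = [False, False, False, True, True, True]

threshold, accuracy = learn_stump(weights, is_player)
print(threshold, accuracy)  # -> 200.0 1.0
```

Real classifiers combine many such tests over many features, but the principle is the same: the rule is learned from labeled data rather than written by hand.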

The activities that the system can recognize include walking, jogging, climbing stairs, sitting, standing and lying down. They collect a small amount of labeled "training" data from a panel of volunteers for each of these activities, with the expectation that the resulting model will be applicable to other users. The researchers make only two assumptions: that the user's phone is running the app in the background, and that the phone is in the user's pocket.
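One way such labeled training data might be organized is sketched below: a stream of raw readings, recorded while a volunteer performed a known activity, is cut into fixed-size windows that each carry the activity label. The window size and data layout here are assumptions for illustration, not the lab's actual setup.

```python
WINDOW_SIZE = 4  # samples per window; real systems use several seconds of data

def make_windows(samples, label):
    """Group consecutive samples into equal-size windows, each tagged with
    the activity the volunteer reported while the data was collected."""
    windows = []
    for start in range(0, len(samples) - WINDOW_SIZE + 1, WINDOW_SIZE):
        windows.append((samples[start:start + WINDOW_SIZE], label))
    return windows

stream = [(0.1, 9.7, 0.3)] * 9  # 9 identical placeholder readings
training = make_windows(stream, "walking")
print(len(training))  # -> 2 (the leftover 9th sample is dropped)
```

Each (window, label) pair then becomes one training example for the classifier.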

Initially, the researchers could identify the six activities listed above with about 75 percent accuracy. These results are adequate for obtaining a general picture of how much time a person spends on each activity daily, but are far from ideal. However, if the researchers can obtain even a small amount of data that a user has actively labeled with a particular activity, they can build a personal model for that user, with accuracy in the 98-99 percent range. This shows that people move differently, and that these differences matter when identifying activities.

The system is called Actitracker. If you download the Android app, it will allow you to review reports of your activities via a web-based user interface. This will allow you to determine how active or – perhaps more to the point – how inactive you are. The researchers believe that these reports may serve as a wake-up call to some users and hope that they will lead to positive changes in behavior. Such a tool could also be used by a parent to monitor the activities of their child, and thus could even help combat conditions such as childhood obesity.

The researchers are also studying what other things they can learn about a user from their accelerometer data. Currently, using this data they can predict a user's gender with 71 percent accuracy, and can distinguish between "tall" and "short" people and "heavy" and "light" people, each with about 80 percent accuracy.

They have also established that one's gait, as measured by a smartphone accelerometer, is distinctive enough to be used for identification purposes. From a pool of several hundred smartphone users, they can identify any individual with 100 percent accuracy if they have a previous data sample. Soon, they hope to be able to use accelerometer data to help diagnose gait problems. This application is important since gait problems are often indicators of other health problems. All of these applications are based on the same underlying methods of classification as the group’s activity recognition work.

This category of applications is part of a growing trend towards mobile health. As new sensors become available and as existing sensors are improved, even more powerful smartphone-based health applications should appear. For example, other researchers are boosting the magnification of smartphone cameras so that they can analyze blood and skin samples. Researchers at MIT's Mobile Experience Lab are even developing a sensor that attaches to clothing, which will allow smartphones to track their users' exposure to ultraviolet radiation and the potential for sunburn.

Smartphone sensor technology, especially when combined with data mining, offers tremendous opportunities for new and innovative applications. Weiss and his colleagues are committed to exploring these applications and expect that there will be a flood of new sensor-based apps over the next decade. While many of these apps may just be curiosities, they suspect that some will "stick" and provide tangible benefits to individuals and society.