Smartwatches Begin to Decode the Hidden Patterns of Everyday Life

Smartwatches are already powerful tools, tracking our steps, heart rate, and even sleep patterns. But imagine if they could go beyond movement—if they could tell whether you were cooking dinner, working at your desk, or helping a loved one with errands. That’s exactly what a team of researchers at Washington State University (WSU) has been working toward.

For years, scientists have been able to identify simple physical movements, like walking or sitting, using wearable devices in lab-controlled settings. Now, WSU researchers have taken a leap forward: they've trained a machine-learning model to recognize a much richer set of everyday activities. Their model doesn't just detect steps or posture; it can infer whether someone is eating, socializing, running errands, or simply relaxing at home.

A New Lens on Everyday Health

The advance, described in the IEEE Journal of Biomedical and Health Informatics, is more than just a technical achievement. It could open new doors in healthcare, rehabilitation, and caregiving—fields where understanding a person’s daily life is as important as any test result.

“If we want to determine whether somebody needs caregiving assistance in their home or elsewhere and what level of assistance, then we need to know how well the person can perform critical activities,” said Diane Cook, Regents Professor in WSU’s School of Electrical Engineering and Computer Science, who led the study.

Can someone reliably cook for themselves, manage money, or do their own shopping? These are the building blocks of independence. And until now, measuring such activities outside of a doctor’s office or care facility has been difficult, imprecise, and often dependent on self-reporting.

Why Daily Activities Matter

One of the biggest challenges in medicine, especially with aging populations and people living with chronic illness, is knowing how patients function from day to day. Medical tests can reveal blood pressure, blood sugar, or lung capacity. But they don't show whether someone is still preparing meals, handling bills, or enjoying hobbies.

For families caring for aging parents from a distance, this uncertainty can be painful. Are they still eating well? Are they staying socially engaged? Are they safe running errands on their own? These are deeply human concerns—and now, technology may help provide clearer answers.

“Lack of awareness of a person’s cognitive and physical status is one of the hurdles that we face as we age,” Cook explained. “Having an automated way to give indicators of where a person is will someday allow us to better intervene for them and to keep them not only healthy, but ideally independent.”

Building a Groundbreaking Dataset

To train their AI, the WSU team collected smartwatch data from 503 participants over the course of eight years. Whenever participants took part in studies that used smartwatches, they were prompted at random times of the day to label what they were doing.

They chose from 12 broad categories, including working, traveling, eating, socializing, running errands, sleeping, and relaxing. This massive effort generated more than 32 million labeled data points, each representing a single minute of activity.
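To make the scale of that labeling effort concrete, here is a minimal sketch of what one labeled minute might look like as a record. Every field name and sensor channel below is an illustrative assumption; the study's actual data format is not described in this article.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class LabeledMinute:
    """One of the ~32 million labeled points: a single minute of activity.
    Field names are hypothetical, not the study's actual schema."""
    participant_id: str      # de-identified participant code
    epoch_minute: int        # which minute of the study this row covers
    accel_xyz: List[float]   # summarized accelerometer readings
    heart_rate: float        # mean beats per minute over the window
    label: str               # one of the 12 self-reported categories

# The article names seven of the 12 categories; the others are not listed.
CATEGORIES = ["working", "traveling", "eating", "socializing",
              "running errands", "sleeping", "relaxing"]

example = LabeledMinute("p001", 421_337, [0.02, -0.13, 0.98], 72.5, "relaxing")
```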

By testing these data points against several artificial intelligence techniques, the researchers arrived at a model that recognizes activities with nearly 78% accuracy, a striking result given the complexity of unscripted, real-world human behavior.
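The paper's title identifies the model as a feature-augmented transformer. As a rough illustration of that general idea, and not the authors' actual architecture, the sketch below runs a window of raw sensor readings through a small transformer encoder and concatenates handcrafted summary features before a 12-way classification head. The dimensions, channel count, and feature vector are all assumptions.

```python
import torch
import torch.nn as nn

class FeatureAugmentedTransformer(nn.Module):
    """Sketch: transformer encoder over raw sensor windows, with
    handcrafted summary statistics fused in before classification."""

    def __init__(self, n_channels=6, d_model=64, n_heads=4,
                 n_layers=2, n_features=16, n_classes=12):
        super().__init__()
        self.embed = nn.Linear(n_channels, d_model)  # per-timestep projection
        layer = nn.TransformerEncoderLayer(d_model=d_model, nhead=n_heads,
                                           batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers=n_layers)
        self.head = nn.Linear(d_model + n_features, n_classes)

    def forward(self, x, feats):
        # x: (batch, time, channels) raw sensor stream for one window
        # feats: (batch, n_features) handcrafted statistics (means, variances, ...)
        h = self.encoder(self.embed(x))  # (batch, time, d_model)
        pooled = h.mean(dim=1)           # average over the one-minute window
        return self.head(torch.cat([pooled, feats], dim=-1))  # 12 class logits

# Dummy forward pass: 8 one-minute windows of 60 readings x 6 channels.
model = FeatureAugmentedTransformer()
logits = model(torch.randn(8, 60, 6), torch.randn(8, 16))  # shape (8, 12)
```

Mean-pooling the encoder output and concatenating engineered features is one common way to fuse learned and handcrafted representations; the published model may differ in depth, pooling strategy, and feature set.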

The Power of Patterns

Activity recognition is not just about knowing what someone is doing at a single moment. It’s about piecing together patterns over time. A change in daily rhythm—less time cooking, more time sleeping, fewer errands—can signal a shift in health or independence.
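One simple way to picture that kind of longitudinal tracking: roll the model's minute-by-minute predictions up into daily totals per activity, then flag large departures from a person's own baseline. The threshold and the numbers in this example are purely illustrative.

```python
from collections import Counter

def daily_minutes(predicted_labels):
    """Total predicted minutes per activity for one day of model output."""
    return Counter(predicted_labels)

def flag_shifts(baseline, today, threshold=0.5):
    """Flag activities whose daily minutes drift far from a personal baseline.
    `threshold` is the fractional change treated as notable (an arbitrary,
    illustrative choice)."""
    alerts = []
    for activity, typical in baseline.items():
        observed = today.get(activity, 0)
        if typical and abs(observed - typical) / typical >= threshold:
            alerts.append((activity, typical, observed))
    return alerts

# Invented example: cooking drops sharply while sleep rises.
baseline = {"cooking": 45, "sleeping": 450, "errands": 30}
today = daily_minutes(["sleeping"] * 700 + ["cooking"] * 10 + ["errands"] * 25)
print(flag_shifts(baseline, today))
# -> [('cooking', 45, 10), ('sleeping', 450, 700)]
```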

“A foundational step is to perform activity recognition because if we can describe a person’s behavior in terms of activity in categories that are well recognized, then we can start to talk about their behavior patterns and changes in their patterns,” Cook said. “We can use what we sense to try to approximate traditional measures of health, such as cognitive health and functional independence.”

Looking Ahead: Smarter, More Human AI

The implications of this research are vast. Clinicians could someday use smartwatch-based monitoring to help diagnose cognitive decline earlier, track recovery after surgery, or personalize rehabilitation plans. Families might gain peace of mind knowing their loved ones are safe and active at home.

The WSU team has also made their dataset—stripped of identifying information—publicly available, so other scientists can build on their work. Future studies could explore how genetics, environment, and health conditions shape daily activity, and how subtle changes in those activities could act as early warning signs.

For now, the achievement marks a major step toward more behavior-aware technology—wearables that don’t just measure the body, but also illuminate the life being lived.

More information: Bryan Minor et al., A Feature-Augmented Transformer Model to Recognize Functional Activities from in-the-wild Smartwatch Data, IEEE Journal of Biomedical and Health Informatics (2025). DOI: 10.1109/JBHI.2025.3586074