
App creates time-lapse videos of the body for telemedicine

December 11, 2024

By Patricia Waldron

A new app developed by Cornell researchers helps users record highly accurate time-lapse videos of body parts – a surprisingly difficult task and an unmet need in remote medicine and telehealth applications.

A team led by Abe Davis, assistant professor of computer science in the Cornell Ann S. Bowers College of Computing and Information Science, created the app, called MeCapture. It uses augmented reality, 3D tracking and interactive visual feedback to help users consistently capture images of body parts day after day and collect medically relevant information outside of the clinic.

A screenshot from the MeCapture app.

Nhan Tran, a doctoral student in the field of computer science and lead researcher on the project, presented the work, “Personal Time-Lapse,” at the ACM Symposium on User Interface Software and Technology (UIST ’24), Oct. 15 in Pittsburgh.

The app’s inspiration came from challenges during the COVID-19 pandemic. “A lot of patients were communicating with their doctors using data they were capturing on their phones,” Davis said.

However, previous time-lapse apps weren’t up to the task.

“Patients might take pictures from different angles or under different lighting, and this makes it hard to compare two photos and tell if the patient is getting better or worse,” he said. “The solution is to make sure the only thing changing from one picture to the next is the patient’s condition. But this requires precise tracking, and existing on-phone trackers break down on subjects that move and bend like the body.”

Davis, Tran and Ethan Yang, a doctoral student in the field of computer science, collaborated with Angelique Taylor, assistant professor of information science at Cornell Tech and Cornell Bowers CIS, who specializes in human-robot interaction, especially in medical settings.

The researchers identified three factors that must be replicated to yield medically useful information: the pose of the body part, the viewpoint of the camera and the lighting conditions. They integrated multiple new functions into a camera-based app to control these variables.

A reference observation of the target body part is captured to indicate the viewpoint and body pose that should be recorded.

To ensure that the body part is posed consistently, the app overlays the reference photo onto the live camera view so the user can move into the correct position. Each pixel also lights up red, blue or green to signal when the body part is too close, too far away or just right, respectively.
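In rough terms, that per-pixel feedback amounts to comparing two aligned depth maps. Below is a minimal Python sketch of the idea; the function name, the depth-map inputs and the one-centimeter tolerance are illustrative assumptions, not MeCapture's actual code.

```python
import numpy as np

def depth_feedback(depth: np.ndarray, ref_depth: np.ndarray,
                   tol: float = 0.01) -> np.ndarray:
    """Color each pixel by how the current depth compares to the reference.

    depth, ref_depth: aligned per-pixel depth maps in meters (hypothetical
    inputs from the phone's 3D tracker). Returns an RGB image where red
    means too close, blue means too far, and green means just right.
    """
    overlay = np.zeros((*depth.shape, 3), dtype=np.uint8)
    diff = depth - ref_depth
    overlay[diff < -tol] = (255, 0, 0)          # body part too close
    overlay[diff > tol] = (0, 0, 255)           # body part too far
    overlay[np.abs(diff) <= tol] = (0, 255, 0)  # within tolerance
    return overlay
```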

To replicate the correct camera viewpoint, the app displays a set of rings that users must line up with crosshairs in the center of the screen.

Finally, because the light changes constantly depending on the location and time of day, the researchers relied on the only light they could control – the flash. The app snaps two photos in quick succession – one with the flash and one without – and then subtracts the background light in the second photo from the first one. Using only the light from the flash keeps the lighting of the body part consistent throughout the time-lapse.
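The subtraction itself is simple image arithmetic. Here is a minimal sketch, assuming two back-to-back frames captured with identical exposure settings; a real pipeline would also work in linear, gamma-decoded pixel values, and the function name is ours.

```python
import numpy as np

def flash_only(with_flash: np.ndarray, no_flash: np.ndarray) -> np.ndarray:
    """Keep only the light contributed by the flash.

    Subtracting the ambient (no-flash) frame from the flash frame removes
    the background lighting, leaving an image lit solely by the flash.
    """
    # Compute in float so that noisy pixels can't wrap around below zero.
    diff = with_flash.astype(np.float32) - no_flash.astype(np.float32)
    return np.clip(diff, 0, 255).astype(np.uint8)
```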

The researchers also added an automatic recapture feature that snaps the photos as soon as the camera is properly aligned, so that the act of pressing the camera shutter button doesn’t throw off the alignment – a common issue during initial testing.
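A hypothetical version of such a trigger fires the shutter only after the tracked alignment errors stay under threshold for a run of consecutive frames, so no button press is involved. The error measures, tolerances and frame count below are assumptions for illustration, not values from the paper.

```python
def make_auto_capture(view_tol: float = 0.02, pose_tol: float = 0.02,
                      hold_frames: int = 10):
    """Return a per-frame callback that says when to fire the shutter."""
    aligned = 0  # consecutive frames within tolerance

    def on_frame(view_err: float, pose_err: float) -> bool:
        nonlocal aligned
        if view_err < view_tol and pose_err < pose_tol:
            aligned += 1
        else:
            aligned = 0
        return aligned >= hold_frames

    return on_frame
```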

Davis and Tran tested MeCapture by tracking their facial hair growth and recording the life cycle of a Chia Pet. Another lab member tracked an accidentally burned hand as it healed, and one researcher documented a healing foot bruise; all of these time-lapses can be seen in the team's interactive 3D gallery.

In its current form, the app provides a general-purpose way to collect data for telemedicine, but the technology can be tailored to specific medical applications.

“A potential extension would be to communicate with the doctors – to look at the data and see how we can refine the application and design some sort of interface for the doctors to help the user capture even better data,” Tran said.

Beyond medicine, this type of app could record the growth of plants for field research and monitor structural health in buildings – tracking damage, such as corrosion or a crack, to see if it spreads. 

The team is currently developing the structural-monitoring application by recording the construction of the new Bowers CIS building on Hoy Road.

This work received partial support from the National Science Foundation and Meta.

Patricia Waldron is a writer for the Cornell Ann S. Bowers College of Computing and Information Science.