According to new research, apps may soon be able to detect the mode of transport you are using and offer relevant advice.
According to researchers at the University of Sussex’s Wearable Technologies Lab, the machine learning techniques developed in a global research competition they launched could also enable smartphones to predict upcoming road conditions and traffic levels on a given route, and then offer suggestions about alternative routes or parking. Phones may even become able to detect the food and drink a user consumes while travelling. The study was published in the Journal of the ACM.
“Previous studies generally collected only GPS and motion data. Our study is much wider in scope: we collected all sensor modalities of smartphones, and we collected the data with phones placed simultaneously at four locations where people typically carry their phones such as the hand, backpack, handbag and pocket,” said study author Daniel Roggen.
“This is extremely important to design robust machine learning algorithms. The variety of transport modes, the range of conditions measured and the sheer number of sensors and hours of data recorded is unprecedented,” he added.
The team, led by Roggen, collected more than 117 days’ worth of data by monitoring several aspects of commuters’ journeys in the United Kingdom across various means of transport, ultimately creating the largest such dataset made publicly available.
The project involved gathering data from four mobile phones carried by researchers on their daily commute over a period of seven months.
The team then launched a global competition challenging entrants to develop the most accurate algorithms for recognizing eight modes of transport (being still, walking, running, cycling, and travelling by car, bus, train or subway) from data collected by 15 sensors, measuring everything from movement to pressure.
Seventeen teams took part; of their entries, two achieved accuracy above 90 percent, eight scored between 80 and 90 percent, and nine scored between 50 and 80 percent.
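To give a sense of how such algorithms work, the sketch below classifies a window of accelerometer readings into a coarse transport state using simple hand-picked features. The feature choices, thresholds, and three-way labels are illustrative assumptions for this article, not the competition entries’ actual methods, which used far richer sensor data and learned models.

```python
import math

# Hypothetical sketch: the features (mean and spread of acceleration
# magnitude) and the thresholds below are illustrative assumptions,
# not the study's or any competitor's actual algorithm.

def features(window):
    """Mean and standard deviation of acceleration magnitude (m/s^2)
    over a window of (x, y, z) accelerometer samples."""
    mags = [math.sqrt(x * x + y * y + z * z) for x, y, z in window]
    mean = sum(mags) / len(mags)
    var = sum((m - mean) ** 2 for m in mags) / len(mags)
    return mean, math.sqrt(var)

def classify(window):
    """Toy rule-based classifier over a coarse subset of the modes."""
    _, std = features(window)
    if std < 0.3:
        return "still"           # almost no variation in movement
    if std < 3.0:
        return "vehicle"         # smooth motion, bus/train/car-like
    return "walking_or_running"  # strong periodic swings

# A phone lying flat reads roughly gravity on one axis:
still = [(0.0, 0.0, 9.81)] * 50
print(classify(still))  # → "still"
```

A real entry would replace the hand-set thresholds with a model trained on labelled windows from all 15 sensors, which is what allowed the top teams to separate the full eight modes.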
The dataset is expected to support a wide range of studies, including transportation mode recognition, mobility pattern analysis, localization, tracking and sensor fusion, as well as applications such as electronic logging devices.
“By organising a machine learning competition with this dataset we can share experiences in the scientific community and set a baseline for future work. Automatically recognising modes of transportation is important to improve several mobile services – for example to ensure video streaming quality despite entering in tunnels or subways, or to proactively display information about connection schedules or traffic conditions,” said Roggen.