A new class of apps that listen to conversations or scan emails and calendars will be able to anticipate what users want and surface information such as websites, videos and maps before they ask for it, or even realize they want it, Reuters reported.
MindMeld, a voice conferencing assistant for the iPad, will follow conversations and surface relevant material as the topic shifts: a city map and tourist information if the chat turns to a trip, or restaurant listings if it turns to eating out.
"Imagine a situation where you're on the phone or talking with a friend," Timothy Tuttle, founder of San Francisco-based Expect Labs, which created the app, said in an interview. "wouldn't it be great if your phone could automatically find the information you're talking about and display it at the right time?"
Users will log in to MindMeld with Facebook. The $3.99 app listens for words and phrases related to current events and local businesses, searching the Internet while people are speaking to gather more information relevant to the conversation, Reuters reported.
If the app detects a user has an appointment, for example, it provides a map and traffic conditions.
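The article does not describe Expect Labs' implementation, but the behavior it reports, spotting topic keywords in a live conversation and triggering a search, can be illustrated with a toy sketch. Everything below (the topic lists, the `search_web` stub, the query mapping) is hypothetical and stands in for whatever proprietary pipeline MindMeld actually uses.

```python
# Toy illustration of keyword-triggered contextual search, loosely modeled
# on the behavior the article attributes to MindMeld. All names here
# (TOPIC_KEYWORDS, search_web, suggest) are hypothetical.

# Hypothetical mapping from conversation topics to trigger phrases.
TOPIC_KEYWORDS = {
    "travel": {"trip", "flight", "hotel", "visit"},
    "dining": {"restaurant", "dinner", "lunch", "eat out"},
    "appointment": {"appointment", "meeting", "be there at"},
}

def detect_topics(transcript: str) -> set[str]:
    """Return every topic whose trigger phrases appear in the transcript."""
    text = transcript.lower()
    return {topic for topic, phrases in TOPIC_KEYWORDS.items()
            if any(phrase in text for phrase in phrases)}

def search_web(query: str) -> str:
    """Stand-in for a real search API call; returns a placeholder result."""
    return f"[top result for: {query}]"

def suggest(transcript: str) -> list[str]:
    """Surface information per the article's examples: maps and tourist
    info for trips, restaurants for dining, traffic for appointments."""
    queries = {
        "travel": "city map and tourist information",
        "dining": "restaurants nearby",
        "appointment": "map and traffic conditions",
    }
    return [search_web(queries[topic]) for topic in detect_topics(transcript)]

# Example: a snippet of live conversation yields travel suggestions.
print(suggest("We should plan our trip to Rome and book a hotel."))
```

A production system would of course need speech-to-text and far more robust topic modeling than substring matching; the sketch only shows the detect-then-search loop the article describes.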
Up to eight people can join a chat. To protect privacy, conversations are not recorded or stored, according to Reuters.
Tuttle predicts that during the next few years computing devices will move from laptops, smartphones and tablets to everyday objects like a table, or wearable technologies such as Google Glass. The motivation for creating the MindMeld app was the belief that these new devices will not have keyboards.
"Tomorrow, our computing devices will pay attention continuously, anticipate what information might be relevant, and be ready at a moment's notice to give you the exact information you need," Tuttle said.
These devices and apps will listen to what people are saying, watch what they are reading and writing, and take note of the places they visit. The company also plans to release iPhone and Android apps, Reuters reported.
"By interpreting these contextual signals, our apps and devices will become much better at finding the information we need, in some cases, before we even need to ask," Tuttle said.