This is my dumping ground for quotes and other stuff relating to the wonderful world of digital & communications.
What does it mean that Google really is trying to build the Star Trek computer? I take it as a cue to stop thinking about Google as a “search engine.” That term conjures a staid image: a small box on a page in which you type keywords. A search engine has several key problems. First, most of the time it doesn’t give you an answer—it gives you links to an answer. Second, it doesn’t understand natural language; when you search, you’ve got to adopt the search engine’s curious, keyword-laden patois. Third, and perhaps most importantly, a search engine needs for you to ask it questions—it doesn’t pipe in with information when you need it, without your having to ask.
The Star Trek computer worked completely differently. It understood language and was conversational, it gave you answers instead of references to answers, and it anticipated your needs. “It was the perfect search engine,” Singhal said. “You could ask it a question and it would tell you exactly the right answer, one right answer—and sometimes it would tell you things you needed to know in advance, before you could ask it.”
Arnold also sees virtual assistants as intellectual equalizers. A superb memory might cease to be an advantage as intelligent assistants are tasked with remembering names, dates and other details. Everyone will have the ability to see unusual but important connections between legal cases or patients’ symptoms, thanks to assistants that can identify relevant precedents or files.
“In this world of huge and big data, you won’t be able to program machines for everything they should know,” said Ms. Rometty. “These machines will have to learn what is right, what is wrong, what is a pattern.” It is the third wave of computing, she said. At first, computers could count. Today, they are programmed to follow “if this, then that.” Next they will need to discover and learn on their own, she said, not just as a search engine, but proactively.
Expect Labs, a San Francisco start-up, has spent the past two years building an “anticipatory computing engine”: a platform for applications that predicts what people want or need before they explicitly ask or search for it.
Its first app for the iPad is MindMeld, a group voice and video-calling app that analyses what’s being talked about in real-time and “predicts” the type of information participants may want or need, pushing it to their tablets within seconds.
MindMeld listens to its users’ conversation and brings up related information
For example, let’s say several co-workers are planning to meet up for bar snacks after work.
Depending on what types of food, drinks and possible meeting places are mentioned, MindMeld “listens” in the background and pulls up pertinent restaurant suggestions, reviews, maps, images and phone numbers, using data from across the web and social networks.
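The pipeline described above — spot food- and venue-related topics in a live conversation, then push matching suggestions without being asked — can be sketched roughly like this. To be clear, the topic lists and the suggestion lookup are my own illustrative assumptions, not MindMeld's actual engine or API:

```python
# Illustrative sketch of an "anticipatory" topic spotter.
# The term lists and suggestion logic are assumptions for demonstration,
# not MindMeld's real implementation.

FOOD_TERMS = {"tapas", "sushi", "pizza", "bar snacks", "wings"}
DRINK_TERMS = {"beer", "wine", "cocktails", "sake"}

def spot_topics(transcript: str) -> set[str]:
    """Return any food/drink topics mentioned in the conversation so far."""
    text = transcript.lower()
    return {term for term in FOOD_TERMS | DRINK_TERMS if term in text}

def suggest(transcript: str) -> list[str]:
    """Proactively build suggestions based on topics heard in the background."""
    topics = spot_topics(transcript)
    # A real system would query restaurant, review and map services here;
    # this just fabricates placeholder suggestion strings.
    return [f"Places near you for {t}" for t in sorted(topics)]
```

The point of the sketch is the inversion it shares with the quoted description: the user never issues a query; the running transcript itself is the query.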
Google, on Tuesday, was awarded a patent for “advertising based on environmental conditions.”… So Google can now deliver targeted ads to users based on their surrounding environment. For example, the patent notes, temperature information gathered by a phone’s sensors can be used to flash ads for air conditioners (if temperatures exceed a certain threshold), or winter coats (if temperatures fall below a certain benchmark).
Sensor info isn’t the only environmental information Google wants to analyze with the patent. Google also wants to analyze background information:
“An audio signal that includes a voice instruction from a user of the remote device can be received, and the environmental condition can be determined based on background sounds in the audio signal,” the patent reads.