Apple and Google are finally making everything they touch supremely intelligent, and that has massive implications

Jun 11, 2015, 00:38 IST

Both Apple and Google addressed the future of their respective software platforms over the last two weeks, and both made it clear that intelligence will be a major emphasis moving forward.
Google has Google Now, the service that helps Android users by making recommendations based on their data - anything it can pull from Gmail, like flight information, restaurant reservations, package tracking, movies, and more. At Google's developer conference, I/O, in late May, Google one-upped Now with a new product called Google Now On Tap. While Google Now was only accessible via apps like Search and Chrome, Now On Tap is part of the Android platform itself. It can read anything on your phone's screen to give you more information about what you're looking at in that very instant - so if you're reading a text about a new restaurant, you'll immediately see information about that place, including the ability to make a reservation. It's Google's idea of a more proactive assistant. You have to see it to believe it.

Apple also emphasized intelligence at the keynote for its Worldwide Developers Conference on Monday, announcing more smart features across both its desktop and mobile platforms. For instance, Mac owners will be able to use Spotlight search with more natural-language queries like "Show me documents I wrote last June," and it can also suggest people to contact, places nearby, sports scores, and more, based on your searches.


Siri is also going to be more proactive thanks to iOS 9: your mobile devices will be able to learn your habits and activate certain apps at certain times. Siri can pull up your favorite workout music if you plug in your headphones around a certain time or place, for instance, or know to pull up your podcasts when you connect to your car. It can even identify unknown phone numbers by looking through your mail and contacts. It's not at the level of Now On Tap, but Siri in iOS 9 will be extremely useful for staying organized, searching, and discovering.

Both Apple and Google are emphasizing this kind of intelligence across all their apps, too. Google's new Photos application, which lets you store a lifetime's worth of photos and videos, can find the pictures you're looking for without you needing to label every single photo. Google's artificial intelligence has face-matching and object-recognition technology, so it can "see" everything in your photos - search "beach" or "dogs," and it can pull those up for you immediately.

Apple, meanwhile, wants to use intelligence to branch out into others' applications, so it announced a big move on Monday: Apple will open up its search API so developers can get their apps into Spotlight search results. In other words, if you want to go on vacation, you can swipe left from the home screen to access Spotlight search, type in a destination - maybe Hawaii, or Paris - and Siri can show you specific listings from Airbnb for those places, or point you to flight reservations via Kayak, as long as you've downloaded those apps. Or maybe you want to cook a dish using a new ingredient you just bought, like goat cheese: Spotlight search can pull up recipes from that cooking app you downloaded a while ago, or point you to the Wikipedia page.

Obviously, these kinds of intelligent services have broad implications for many businesses as well as our own lives - but of these two companies, only Apple is emphasizing anonymity and privacy.

"We don't mine your email, your photos, or your contacts in the cloud to learn things about you - we honestly just don't want to know," Craig Federighi, Apple's senior VP of software engineering, said Monday. "All of this is done on device and it stays on device, under your control. And if we do have to perform a lookup on your behalf - for traffic conditions, for instance - it's anonymous, it's not associated with your Apple ID, it's not linked to other Apple services, and it's not shared with third parties. You are in control."
Google, on the other hand, wants to know everything about you, from your photos to your music to your apps. Yes, it will hand some of that data off to developers and third parties, but it's all for the sake of convenience and delight, and Google Now On Tap delivers plenty of both. The fact is, Google already knows a lot about you through Search and Gmail, so many people don't mind it using that knowledge to present music recommendations, news updates, or reminders.

Moving forward, expect Apple and Google to emphasize the strengths of their intelligence platforms - privacy on iOS, and functionality on Android - as they advertise new products: Google Photos already looks like the best photos tool ever created, and it will be interesting to see how Apple's deep linking works in iOS 9 searches. Artificial intelligence is still a major sticking point in the tech world, with many smart people like Elon Musk and Stephen Hawking concerned about its future, but in small doses like these, Apple and Google are showing it can be wonderful.
