Google
Google has Google Now, the service that helps Android users by making recommendations based on their data - anything it can pull from Gmail, like flight information, restaurant reservations, package tracking, movies, and more.
At Google's developer conference I/O in late May, Google one-upped Now with a new product called Google Now on Tap. While Google Now was only accessible via apps like Search and Chrome, Now on Tap is actually part of the Android platform. It can read just about anything on your phone's screen to give you more information about what you're looking at in that very instant - so if you're reading a text about a new restaurant, you'll immediately see information about that place, including the ability to make a reservation. It's Google's idea of a more proactive assistant. You have to see it to believe it.
Apple
Siri is also going to be more proactive thanks to iOS 9, so your mobile devices will be able to learn your habits and activate certain apps at certain times.
Siri can pull up your favorite workout music if you plug in your headphones around a certain time or place, for instance, or know to queue up your podcasts when you connect to your car. It can even identify unknown phone numbers by looking through your mail and contacts. It's not at the level of Now on Tap, but Siri in iOS 9 will be extremely useful for staying organized, searching, and discovering.
Both Apple and Google are emphasizing this kind of intelligence across all their apps, too. Google's new Photos application, which lets you store a lifetime's worth of photos and videos, can find the pictures you're looking for without you needing to label every single photo. Google's artificial intelligence uses face-matching and object-recognition technology so it can "see" everything in your photos - so if you search "beach" or "dogs," it can pull those up for you immediately.
Apple
Apple, meanwhile, wants to use intelligence to branch out into third-party applications, so it announced a big move on Monday: Apple will open up its search API so developers can get their apps' content into Spotlight search results.
In other words, if you want to go on vacation, you can swipe left from the home screen to access Spotlight search, search for a destination - maybe Hawaii, or Paris - and Siri can show you specific listings from Airbnb for those places, or point you to flight reservations via Kayak, as long as you've downloaded those apps. Or maybe you want to cook a dish using a new ingredient you just bought, like goat cheese: Spotlight search can pull up recipes from that cooking app you downloaded a while ago, or point you to the Wikipedia page.
Obviously, these kinds of intelligent services have broad implications for many businesses as well as our own lives, but of these two companies, only Apple is emphasizing anonymity and privacy.
Google
Google, on the other hand, wants to know everything about you, from your photos to your music to your apps. Yes, it will hand off some of that data to developers and third parties, but it's all for the sake of convenience and delight, and Google Now on Tap certainly delivers both. The fact is, Google already knows a lot about you through Search and Gmail, so many people don't mind it using this knowledge to present music recommendations, news updates, or reminders.
Moving forward, expect Apple and Google to emphasize the strengths of their intelligence platforms - privacy on iOS, and functionality on Android - as they advertise new products: Google Photos already looks like the best photos tool ever created, and it will certainly be interesting to see how Apple's deep linking works in iOS 9 searches. Artificial intelligence remains a contentious topic in the tech world, with many smart people like Elon Musk and Stephen Hawking concerned about its future, but in small doses like this, Apple and Google are showing it can be wonderful.