When Apple CEO Tim Cook said that the two new iPhones released today "changed everything," he was referring to the introduction of a new technology called "3D Touch."
The screens on the new phones can recognize pressure, meaning you can trigger different actions depending on how hard you touch the screen. A light press and a longer, deeper press now have different effects from the usual tap.
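For the curious, iOS 9 exposes this pressure reading to app developers on every touch event. Here's a rough sketch of how an app could tell a tap from a deeper press; the 0.3 and 0.7 cutoffs are illustrative guesses, not Apple's actual values:

```swift
import UIKit

// A toy view that distinguishes a tap, a press, and a deeper press by
// reading the pressure value iOS reports on each touch. The 0.3 and 0.7
// thresholds are made-up examples, not Apple's actual cutoffs.
class PressureView: UIView {
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first,
              traitCollection.forceTouchCapability == .available,
              touch.maximumPossibleForce > 0 else { return }

        // Normalize to 0.0-1.0: per Apple's docs, `force` is scaled so an
        // average touch reads about 1.0, capped at `maximumPossibleForce`.
        let pressure = touch.force / touch.maximumPossibleForce
        switch pressure {
        case ..<0.3: print("light touch: treat as an ordinary tap")
        case ..<0.7: print("press: could show a preview")
        default:     print("deep press: commit to the action")
        }
    }
}
```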
This opens up a whole range of new shortcuts. For example, you could get directions home simply by pressing firmly on the Maps app on your home screen, quickly take a selfie by pressing on the Camera app, or check your schedule on a given day by pressing on a date in, say, an email.
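Third-party developers can offer those home screen shortcuts too, through iOS 9's quick actions API. A minimal sketch, with a made-up shortcut identifier and title:

```swift
import UIKit

// A sketch of registering the kind of home screen shortcut described
// above, using iOS 9's quick actions API. The type string and title
// are hypothetical examples.
func registerQuickActions() {
    let directionsHome = UIApplicationShortcutItem(
        type: "com.example.directions-home",   // hypothetical identifier
        localizedTitle: "Directions Home",
        localizedSubtitle: nil,
        icon: UIApplicationShortcutIcon(type: .location),
        userInfo: nil
    )
    UIApplication.shared.shortcutItems = [directionsHome]
}
```

When the user picks one of these shortcuts from the home screen, the system hands the chosen item to the app delegate's application(_:performActionFor:completionHandler:) method.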
The feature took many years to create, according to Bloomberg's Josh Tyrangiel.
It was extremely complicated technically: Apple engineers had to integrate new sensors into the iPhone's screen and pair them with embedded accelerometers so the phone can continuously measure a user's finger pressure and work out what they're trying to do.
"It starts with the idea that, on a device this thin, you want to detect force. I mean, you think you want to detect force, but really what you're trying to do is sense intent," Apple software engineering SVP Craig Federighi told Bloomberg. "You're trying to read minds."
You can get a sense of how complicated the feature really is by reading the rest of Federighi's explanation:
...You have a user who might be using his thumb, his finger, might be emotional at the moment, might be walking, might be laying on the couch. These things don't affect intent, but they do affect what a sensor [inside the phone] sees. So there are a huge number of technical hurdles. We have to do sensor fusion with accelerometers to cancel out gravity, but when you turn [the device] a different way, we have to subtract out gravity. … Your thumb can read differently to the touch sensor than your finger would. That difference is important to understanding how to interpret the force. And so we're fusing both what the force sensor is giving us with what the touch sensor is giving us about the nature of your interaction. So down at even just the lowest level of hardware and algorithms, I mean, this is just one basic thing. And if you don't get it right, none of it works.
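Apple hasn't shared the actual code, of course, but a toy sketch makes the two ideas in that quote concrete: cancel out gravity using the accelerometer, then weight the raw force reading by what the touch sensor reports about the contact patch (an app could feed in UITouch's majorRadius for that). Every name and constant below is invented for illustration:

```swift
import CoreMotion

// A toy illustration, NOT Apple's implementation: subtract the gravity
// bias reported by the accelerometer, then discount the raw force when
// the contact patch looks wide and thumb-like. All constants are made up.
struct ForceEstimator {
    let motion = CMMotionManager()

    init() {
        // CoreMotion only fills in deviceMotion after updates start.
        motion.startDeviceMotionUpdates()
    }

    // Remove the bias gravity adds along the screen's normal (z) axis:
    // a phone lying flat (gravity.z near -1) reads presses heavier than
    // one held upright (gravity.z near 0).
    func gravityCorrected(rawForce: Double) -> Double {
        guard let gravity = motion.deviceMotion?.gravity else { return rawForce }
        return rawForce - abs(gravity.z) * 0.05   // 0.05: made-up bias scale
    }

    // Fuse force with contact size: a wide, thumb-like contact registers
    // more raw force for the same intent, so scale it down.
    func fusedForce(rawForce: Double, contactRadius: Double) -> Double {
        let corrected = gravityCorrected(rawForce: rawForce)
        let thumbLikeness = min(contactRadius / 10.0, 1.0)  // 10pt: made-up reference
        return corrected * (1.0 - 0.3 * thumbLikeness)      // 0.3: made-up discount
    }
}
```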
Read the rest of Bloomberg's piece here and watch Apple's video about 3D Touch.