
The company behind Sam Altman's iris-scanning ID startup just revealed new tech. Its privacy chief explains how it works.

Sep 19, 2024, 23:57 IST
Business Insider
Damien Kieran is chief privacy officer of Tools for Humanity. Tools for Humanity
  • Worldcoin scans irises to create a secure, encrypted network for identity verification.
  • About six million people use Worldcoin, with companies such as Reddit and Discord integrating it.

In Sam Altman's vision of the future, everyone has a safety net in the form of universal basic income.

And until he recently mentioned "universal basic compute," his master plan was to use an identity verification startup called Worldcoin to distribute funds to people worldwide.

The premise is simple, albeit a bit futuristic. Worldcoin is building a directory of every human by scanning their irises with a baseball-sized orb. From that scan, it creates a unique code users can use to log into other platforms. Eventually, it might also be how humans collect universal basic income.

More than six million people worldwide use the technology. Companies including Reddit, Discord, and Okta are already working with Worldcoin to help users log into their platforms safely. However, it has also caught the attention of authorities in countries like Germany, France, and Kenya, who worry about how the company uses the data it collects.

Worldcoin believes its technology — a private, encrypted network that preserves human identity — is critical, especially as the rapid developments in AI technology have made it harder to distinguish between humans and bots.


As part of that mission, the platform announced new "Face Auth" technology on Thursday. It's a 1:1 face comparison that ensures only the person who verified their World ID can use it. The technology is similar to Apple's Face ID but mobile platform-agnostic, given that many Worldcoin users have Android devices.

It's being overseen by Damien Kieran, chief privacy officer at Tools for Humanity, the company charged with building the technology behind Worldcoin.

The tech industry veteran was previously general counsel at the once-buzzy photo startup BeReal and deputy general counsel at Twitter, where he reported directly to Elon Musk.

Kieran told Business Insider about how the company handles user data and how it'll play a role in the future.

Why are irises a good way to identify humans?

They are very stable over time, and with modern technology they're "spoof-proof." I can take a photo of your face and, through complicated AI, fool Face ID, for example. An iris is more spoof-proof.


Note: A spokesperson for Tools for Humanity also directed BI to a blog post on irises. It notes that irises have a higher entropy — a degree of randomness or complexity — than fingerprints or faces. Since irises are protected by the eye, they're also less susceptible to change.

How does Worldcoin translate the complexity of an iris into a unique digital code?

We take a photo of your face and we take a photo of your eyes with the orb.

The orb runs some checks to confirm that you're a human and that you're alive; depending on those checks, it then looks at the eye photo. What it does with the eye photo is create an iris code. It's not some dystopian scanning thing — it's a very advanced camera.

This is where it gets into the technical parts. An iris code is not something that we scientifically came up with; it's basically a binary string of ones and zeros: 1, 1, 0, 0, 0, 1, 1, 0, 0. So it's a numerical abstraction of the surface of your eye, and everybody's eye is different.

The goal is one World ID per person. So we basically take the ones and zeros that represent someone's eye and check them against the backend. If this is not the first time we're seeing them, we say, "No, you cannot continue, because you already have an account."
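The duplicate check Kieran describes can be pictured as a comparison between bit strings. In the published iris-recognition literature, that comparison is typically a fractional Hamming distance; the code below is a sketch of that idea, and the code length and threshold are illustrative, not Worldcoin's actual parameters.

```python
# Sketch: treating iris codes as bit strings and checking a new
# enrollment against existing ones via fractional Hamming distance.
# All parameters here are illustrative assumptions.

def hamming_distance(code_a: str, code_b: str) -> float:
    """Fraction of bit positions where the two codes differ."""
    assert len(code_a) == len(code_b)
    diff = sum(a != b for a, b in zip(code_a, code_b))
    return diff / len(code_a)

def is_duplicate(new_code: str, enrolled: list[str],
                 threshold: float = 0.32) -> bool:
    """Reject enrollment if the new code is too close to any existing one."""
    return any(hamming_distance(new_code, c) < threshold for c in enrolled)

enrolled = ["110001100", "001110011"]
print(is_duplicate("110001101", enrolled))  # near-match -> True
print(is_duplicate("010101010", enrolled))  # distant from both -> False
```

A distance threshold, rather than exact equality, is what makes this robust: two captures of the same eye never produce bit-identical codes.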


If it is the first time, the orb takes the iris code and cryptographically processes it. We take the ones and zeros and run them through cryptography that tears them apart into two separate codes that do not look anything like the original. One could literally be 5, 6, 7, 8, and the other could be 1, 2, 3, 4. Individually, neither of those new codes looks anything like the iris code, nor can either be brought back to the iris code on its own.
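"Tearing apart" a bit string into pieces that are individually meaningless is the core idea of secret sharing. A minimal way to do it is 2-of-2 XOR secret sharing, sketched below; Worldcoin's actual scheme (secure multi-party computation over iris codes) is more involved, so this only illustrates the principle.

```python
# Sketch: 2-of-2 XOR secret sharing. Each share alone is uniformly
# random noise; only XOR-ing both together recovers the original.
import secrets

def split(iris_code: bytes) -> tuple[bytes, bytes]:
    share1 = secrets.token_bytes(len(iris_code))            # uniformly random
    share2 = bytes(a ^ b for a, b in zip(iris_code, share1))
    return share1, share2                                   # each alone reveals nothing

def recombine(share1: bytes, share2: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(share1, share2))

code = b"\xc6\x30"   # stands in for the ones-and-zeros iris code
s1, s2 = split(code)
assert recombine(s1, s2) == code   # both shares are needed to recover it
```

Because `share1` is chosen at random, `share2` is also uniformly distributed on its own, which is why neither piece can be traced back to the iris code without the other.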

Where are these codes stored?

We take those two pieces of code and store them in two different data stores. They're owned by two legally distinct companies, and we're adding a third in the coming months, with the hope of adding many more after that. So we will break the iris codes into 20, 30, 40, 50 pieces — as many as we can do.

Our goal is that Tools for Humanity would not operate any of those databases.

What does this mean for users?

What we do on the orb is we wrap up the photos, a copy of the iris code, and a secure key — a private key, it's all encrypted — and we pass it back to the user's device, and it remains on their device.

This is basically to do a couple of different things. One, they should have a copy of their data. It's their data; it's not ours, and we don't want it. Two, the private key is how they actually communicate with our systems and other services. That private key is their unique code for everything.


Is there a way to access someone else's code?

To get an iris code, you have to recombine all of those pieces. You have to know how to recombine them, and — the important part — you would have to have a photo of the original irises to be able to link the code back to the person.

But we never get the photos. We never get a photo of your face. We never get a photo of your eyes. We give them to the user. The one person who can access that information is the owner of the World ID — the user. If the user were to delete their own key on their phone, which you can back up to Google Cloud or Apple iCloud, I couldn't even access the pieces of the code in the databases. So at that point, it's completely anonymized.

How can I use my code right now?

I will use Twitter as an example because it's near and dear to my heart. When you log into Twitter, you could use your username or password, but you could also use your Google email. Twitter, or any other service, could also enable login with World ID.

So, if I want to log into my Twitter account and associate my World ID with it, I would press the login button. Twitter would send a request to my device saying that I'm trying to log in with my World ID.

My device would take my private key, wrap it up with the request from Twitter, and encrypt it. It would then get a piece of information from a public source, a public blockchain, which is the public key.


It would then take that information and make another request to our databases, which hold those broken-up pieces of information. The request it's making is, "Is this a unique human?" The answer is yes. It sends a "yes" back to my device; my device packages it up and sends that to Twitter.
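The flow Kieran walks through is a challenge-response login. A real World ID login uses asymmetric keys and zero-knowledge proofs; in the sketch below an HMAC over a device-held secret stands in for the signature so the example stays standard-library only, and all names are illustrative.

```python
# Sketch of the challenge-response login flow, with HMAC as a
# stand-in for a real asymmetric signature. Illustrative only.
import hashlib
import hmac
import secrets

private_key = secrets.token_bytes(32)   # created at enrollment, stays on the device

# 1. The service (e.g. Twitter) sends a login challenge to the device.
challenge = secrets.token_bytes(16)

# 2. The device "signs" the challenge with its key.
response = hmac.new(private_key, challenge, hashlib.sha256).digest()

# 3. The verifying side checks the response and answers the only
#    question that matters: is this a unique, verified human?
def verify(key: bytes, challenge: bytes, response: bytes) -> bool:
    expected = hmac.new(key, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

print(verify(private_key, challenge, response))  # True -> "yes" goes back to Twitter
```

The key point of the design is that the service only ever receives a yes-or-no attestation, never the iris code, the photos, or the private key itself.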

What is the goal of this technology?

Maybe the way to think about what we're doing is the protocol — the term you'll see in the papers — which is basically a standard. If you have an iPhone, it's got a USB-C charger. A bunch of tech companies got together and agreed on the standard so it's interoperable. We want the protocol to be the standard.

Why is this so critical in an AI age?

For World ID, privacy is the product. This extends to the entire project — from vision to principles and more. We are committed to enhancing people's privacy in the age of AI by leveraging cutting-edge cryptographic technology and developing new technology like Face Auth to further that mission. As AI continues to advance and open up incredible new opportunities and challenges, we hope to set a new standard for security, transparency, and giving people full control and choice over their data.

How might this technology be used for distributing universal basic income?

Our goal is to build the largest trusted network. When you have a very large trusted network for online digital transactions — and I have to stress that digital transactions are not just money; they're all the things — you'll be able to do other things with that large network.

One of those things could eventually be UBI. Right now, what that looks like, I think, is too premature to tell.


Even Alex, our CEO, and Sam Altman have said different things over the years. It's evolving because we're learning more about what that might look like. I think building an infrastructure layer that would allow that to happen is at least one of the things that we believe is possible.
