Contrary to the popular belief that data privacy is about making sure your data isn’t shared, Eli Wachs believes that data sharing, in fact, has significant benefits . . . when used correctly. Today, Wachs, 24, is the CEO of Footprint, a company that is rethinking how we store and share personal data online.
Launched in August by Wachs and Alex Grinman (cofounder of Krypt.co), with $6 million in seed funding, Footprint aims to be the Apple Pay of identity. Instead of having to fill out multiple forms when applying for a mortgage, renting an apartment, or opening a credit card, users fill out only one form on Footprint. Footprint stores the data securely and uses biometric verification to confirm people’s identities. Then, Footprint charges companies that need personal information, such as banks, a fee to access and store only the personal data they need.
Wachs has always believed in the power of technology to make the world a better place. As a high school freshman, he started High School HeroesX, which ran competitions for high school students to work on social challenges on a local level, such as increasing high school graduation rates in economically disadvantaged schools and improving lax food-safety legislation. As a freshman at Stanford, he started a company to track flu strains and later became fascinated by Facebook’s sprawling Cambridge Analytica scandal. In 2022, a couple of years after graduating from college, he and Grinman started Footprint.
Fast Company chatted with Wachs about what he believes the future of data privacy should look like. In conversation, Wachs is expansive but also realistic. He has grand visions for the future, which are often counterintuitive—but isn’t that how the future is created? Still, while he didn’t always have answers about how to reach that future or implement change, he wasn’t afraid to admit when he didn’t have an answer, or even when the answers were unknowable.
Fast Company: How did you develop your philosophy around data privacy?
Eli Wachs: The more I studied data privacy, the less I agreed with the conventional wisdom. Everyone was talking about, How do we stop using data? But my biggest complaint with Google wasn’t what they did with my data, but what they weren’t doing with it. For example, Google’s probably the best predictor in the world of getting Parkinson’s because it can gauge finger tremors from your typing rate over time. I had a family member get sick and I got really obsessed with the idea of, How do we help people control their data?
I didn’t necessarily know what that meant. I was glad lawmakers were paying attention to privacy with GDPR in Europe and CCPA in America, but at the same time I don’t think there should be a trade-off between usability and privacy for people on the internet. And that’s how I got really interested in this space: I wanted to remove that trade-off.
FC: What’s your vision for creating an internet where privacy doesn’t affect usability?
EW: At the start of the pandemic, I downloaded my data from everywhere. It was fascinating. You’d be shocked by how much Facebook and Twitter have but how little they know—and that’s because there’s a level of trust we don’t have in them. Facebook can’t ask you the 5 or 10 questions it needs to make a really good advertisement, so they end up collecting a lot of random things.
Now, to take a step back, you’re not asking about social media, but more foundational uses of your identity, such as opening a credit card. Every time you do that, you have to give away personal information like your birthdate and Social Security number; but if a company gets hacked, your information is leaked. But actually, those companies don’t need very much. They just need to know: Are you who you say you are? Are you a trusted person? Are your address and Social Security number valid? And they need to be able to access this information if they need it. There isn’t a need for everyone to store a duplicate copy. The analogy I think of is banks in the 1850s that were all using different bank notes. We’re doing the same thing with identity.
I believe we need a trusted third party to answer those kinds of basic questions and reduce the number of silos holding your data. So, you fill out your information once, companies can access what they need when they need it, and then you can verify your identity using biometrics like Face ID. This reduces the number of places your data is stored. Also, if everything is centralized and someone else tries to claim your identity, verification will make it clear that they are lying.
FC: If there’s only one company that’s the main vault for personal information, wouldn’t it become a target for hackers?
EW: So, definitely anyone who says they built something that’s impossible to hack is lying—all you can do is make hacking logistically impractical and expensive.
However, the idea is that there’s only one of me—if someone tries to steal my identity, the second we both verify, it’ll be clear who is lying. We’re working with Apple on Face ID interoperability across the internet and multiple devices. We’re also using Amazon’s Nitro Enclaves, which is a very secure environment. It also creates a record of whenever data is accessed, so we can see if something suspicious happens.
Today, identity theft works like this: Batch files are sold on the dark web, and then fraudsters create thousands of fake accounts. But now they’ll need thousands of faces to unlock those accounts. It just adds levels of logistical complexity and friction.
FC: This leads me to the opposite end of the spectrum—when you have a centralized data repository, how do you avoid a surveillance system in which one company has a lot of power to restrict or enable access for other people?
EW: It’s a good question. It’s easy to be naïve about wanting to make the world a better place. Facebook was trying to make the world a better place, and maybe it was doing that for a while until it wasn’t. We can never be confident we won’t end up in the same place.
But I do think the tech that didn’t deliver on its promise was when things were black-boxed that didn’t need to be. Targeted advertising on Facebook isn’t that intense: What they did wasn’t that terrible. I still think Facebook really messed up at the beginning [by] not just saying, “We don’t sell your data; we sell access to advertise to you.” Meanwhile, Apple did a great job.
I think a lot of it starts with transparency. My investor updates are really long: They are philosophy, they’re often not really even about Footprint. They’ve got everything from empathy to, like, women’s privacy rights. It’s dreams and failure. I try to talk about failure because apparently it’s taboo in the Valley. That even extends to being public about our pricing and how we make our money.
But we can talk a big game, and then what happens if we get a subpoena to hand over location data? I still think giving people control can mean also giving them deletion rights. We spend a lot of time internally talking about how to be a positive force that can unblock different opportunities, not get someone blacklisted from the financial system. But to answer your question, we can never be too confident.
FC: What makes you say Apple did a great job?
EW: Apple [is] the best privacy marketer in the world. Facebook took terms that people were talking about, and just didn’t talk about them. Apple took the terms and defined them narrowly, but people never heard the definitions. For example, Apple released something called ATT [App Tracking Transparency], which people may know as the thing that pops up and says, “Do you want this app to be able to track your data?”
For Apple, privacy essentially is being able to opt out of the tracking data. What’s interesting is that Apple totally removed themselves from that pop-up. If you don’t want Apple to track you, you have to go four layers deep into settings. Apple thinks tracking is if anybody else from the outside is following you around your iPhone, but Apple thinks that if they follow you around your iPhone, that’s okay. I don’t think that makes them bad people.
I think that some of the best tools that Apple’s ever built are with my data. I love Apple Pay. It’s nice to not touch things to pay during COVID-19 times. Apple Health has inspirational success stories of people getting into car crashes and calling ambulances. To me that’s what privacy looks like. We trust Apple. We’re okay with them having our data because we trust them to do good things with it. So, yes, they’ve defined privacy differently than Merriam-Webster’s dictionary, but I think that’s okay. To me, privacy is about having trusted parties that know what to do when I need help.
FC: In some ways, aren’t the companies that don’t talk about what they are capable of doing with our data the most trusted?
EW: You’re definitely right that this is a paradox. The companies that have the highest Net Promoter Scores collect the most data. The companies that have been able to build the best services, like Google and Apple, have the most data.
But I think people are actually more willing to have these relationships if they realize what’s being exchanged or what’s not being exchanged. I think companies would actually have more users if they were more open about what’s actually going on.
FC: You also mentioned wanting to put control of people’s data in their hands and provide more insights to them. How will you do that?
EW: So, it’s a really fair call-out. You’re right, we kind of have step one pretty clearly laid out. I spent two years thinking about Footprint before we started it, and most of my thinking was about phase two: people in control of their data, with us doing different analyses for them.
A lot of privacy companies haven’t caught on because they charge users for the data and require people to go out of their way. Footprint does neither. Instead, we can essentially bootstrap a consumer-privacy company by building an enterprise security company, because new regulations are really bringing those two goals together. And no one has to use free time to make a Footprint: When you create a fintech account that you need, it creates a Footprint by default. After that, in time, the hope is to see a lot of ecosystems emerge around the product to help people.
Sometimes, in companies, the mission doesn’t match the first product and you have to contort yourself to get there. But I don’t see another way to get to that phase.
FC: What keeps you up at night?
EW: I don’t want an internet where we all lose trust and we stop interacting with services. This isn’t me saying go sign up for everything on Facebook. Data brokers are terrible creatures. Some of the categories they have for targeting people are really awful: recently divorced people, people who had children. It’s disgusting. That shouldn’t happen. That’s not the data I think should be out there. But I do have the optimism that there are a lot of powerful things we can do if we have trusted third parties.