March 29, 2026 · Permissionless Technologies
Why We're Building Privacy Infrastructure From Scratch
How a stablecoin project, broken privacy tools, and a Vitalik paper led us to build modular privacy and compliance SDKs for Ethereum.
It Started With a Function Call
```solidity
function freeze(address account) external onlyRole(FREEZER_ROLE)
```

One line of Solidity. Sometimes it's called `freeze`. Sometimes `blacklist`. Sometimes `addBlackList` or `setAssetFrozen`. Every major stablecoin in circulation — USDC, USDT, PYUSD — has some version of it. Tether has blacklisted over 5,000 wallets holding more than $3 billion. Circle has frozen $109 million in USDC.
And freezing isn't the only problem. Tether's reserves — the collateral supposedly backing every USDT in circulation — went without a full independent audit for over a decade. What they published instead were quarterly "attestations" by BDO Italia: single-day snapshots with no disclosure of custodians or counterparties. In December 2025, S&P rated USDT's stability as "weak." Nearly a quarter of Tether's reserves sit in bitcoin, gold, secured loans, and corporate bonds — assets that can lose value fast. A stablecoin backed by volatile assets is a contradiction in terms.
We started Permissionless Technologies because we thought a stablecoin shouldn't need a permission slip — and shouldn't need you to trust an opaque balance sheet.
If it can be frozen, seized, or depends on a bank for collateral — it's not really a stablecoin. It's a bank deposit with extra steps. A different financial instrument wearing a decentralized costume. The issuer stands between you and your money, and the issuer can be compelled.
We wanted something else. A dollar-peg protocol. Purely on-chain collateral. No issuer. No admin keys. No blacklist. The smart contract is the counterparty — in both directions. Deposit collateral, receive dollars. Burn dollars, receive collateral. No relationship with us required.
That's what we set out to build. But the deeper we went, the more we realized the stablecoin was only one piece of a much bigger problem.
Everything Is Visible
During our research for UPD — the Universal Private Dollar — the privacy problem became impossible to ignore.
Transparency is blockchain's superpower. Every transaction verifiable. Every balance auditable. Every contract's logic open for inspection. That's the whole point — trustless verification. Nobody needs to take anyone's word for anything.
It's also the greatest vulnerability.
Your salary is public the moment it hits your wallet. A whale holding $50 million is a target for a $5 wrench attack — why bother hacking a private key when you can look up someone's balance and show up at their door? A business paying suppliers reveals its entire cost structure to competitors. A medical payment on-chain could reveal your diagnosis to anyone who cares to look.
This isn't a niche concern for cypherpunks. It's a structural limitation that prevents serious adoption. No CFO will put payroll on a chain where every competitor can see every line item. No fund will reveal its trading positions in real time. No individual with meaningful holdings wants their net worth on a public ledger.
So we looked at what existed. The options were worse than we expected.
Everybody Gets Mixed
Tornado Cash was the dominant privacy tool on Ethereum. It worked. You deposited ETH in fixed denominations — 0.1, 1, 10, or 100 ETH — and withdrew from a mixed pool. Your deposit went in one side. An equivalent amount came out the other. The link between sender and recipient was broken.
Simple. Effective. And completely indiscriminate.
That last part was the problem.
The technical limitations were real but livable. Fixed denominations only — want to shield 7.3 ETH? That's multiple deposits across multiple pools. No partial withdrawals. No private-to-private transfers. No composability — you couldn't build on top of it. Limited token support. Functionally, it was a one-trick pony. But the trick worked.
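To make that usability cost concrete, here is a small illustrative TypeScript sketch — not Tornado Cash code, just the arithmetic — that greedily decomposes an amount into the fixed pool denominations. Shielding 7.3 ETH works out to ten separate deposits across two pools.

```typescript
// Illustrative sketch, not Tornado Cash code: how many fixed-denomination
// deposits does it take to shield a given amount?
const DENOMINATIONS_ETH = [100, 10, 1, 0.1];

function requiredDeposits(amountEth: number): number[] {
  // Work in integer tenths of an ETH so amounts like 7.3 stay exact.
  let remainingTenths = Math.round(amountEth * 10);
  const deposits: number[] = [];
  for (const denom of DENOMINATIONS_ETH) {
    const denomTenths = Math.round(denom * 10);
    while (remainingTenths >= denomTenths) {
      deposits.push(denom);
      remainingTenths -= denomTenths;
    }
  }
  if (remainingTenths !== 0) {
    throw new Error("amount not representable in fixed denominations");
  }
  return deposits;
}

// Shielding 7.3 ETH: seven 1 ETH deposits plus three 0.1 ETH deposits —
// ten round trips, each paying gas and each a separate linkable event.
console.log(requiredDeposits(7.3).length); // 10
```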
What killed Tornado Cash wasn't the technology. It was the total absence of any compliance mechanism.
In August 2022, the U.S. Treasury's Office of Foreign Assets Control sanctioned Tornado Cash. The allegation: the protocol had been used to launder over $7 billion in cryptocurrency. Among the users — North Korea's Lazarus Group, which had funneled hundreds of millions in stolen funds through the mixer.
In May 2024, co-founder Alexey Pertsev was sentenced to 64 months in a Dutch prison. In August 2025, Roman Storm was convicted in New York of conspiring to operate an unlicensed money transmitting business. A third co-founder, Roman Semenov, remains a fugitive.
Clean funds and sanctioned funds entered the same pool. They exited through the same mixer. The protocol couldn't tell the difference — because it was never designed to. There was no screening, no filtering, no way for a legitimate user to distance themselves from a state-sponsored hacking group. Everyone got mixed together, and everyone was tainted by association.
Tornado Cash proved one thing beyond doubt: the demand for on-chain privacy is massive. Billions of dollars flowed through it. People wanted this.
But it also proved that privacy without any compliance mechanism gets your developers arrested and your protocol sanctioned.
Compliance isn't the enemy of privacy. The absence of compliance is what got privacy tools banned. We needed something different. And in late 2023, a paper appeared that looked like exactly that.
The Paper
In September 2023, Vitalik Buterin co-authored a research paper with a deceptively simple idea.
What if, instead of mixing everyone together and hoping for the best, you let users prove they belong to a set of approved participants — without revealing who they are?
The mechanism: Association Set Providers (ASPs). An ASP maintains a curated Merkle tree of approved addresses — accounts that have passed sanctions screening, KYC checks, or whatever criteria the ASP defines. When you withdraw from the privacy pool, you generate a zero-knowledge proof: "My deposit is in this approved set." The verifier learns that someone legitimate made this withdrawal. Not which someone.
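The non-ZK core of that proof — checking that a leaf hashes up to a published Merkle root — can be sketched in a few lines of TypeScript. Everything here is illustrative: a real ASP would use a circuit-friendly hash like Poseidon rather than SHA-256, and the zero-knowledge circuit would verify this same path computation without revealing the leaf or its index.

```typescript
import { createHash } from "node:crypto";

// Illustrative only: SHA-256 stands in for a circuit-friendly hash, and this
// plain check reveals the leaf — the ZK circuit proves the same statement
// while keeping the leaf and its position private.
const h = (...parts: Buffer[]): Buffer =>
  createHash("sha256").update(Buffer.concat(parts)).digest();

type ProofStep = { sibling: Buffer; leafOnLeft: boolean };

// Root of a Merkle tree over the ASP's approved leaves.
function merkleRoot(leaves: Buffer[]): Buffer {
  let level = leaves.slice();
  while (level.length > 1) {
    const next: Buffer[] = [];
    for (let i = 0; i < level.length; i += 2) {
      next.push(h(level[i]!, level[i + 1] ?? level[i]!)); // duplicate last node on odd levels
    }
    level = next;
  }
  return level[0]!;
}

// Sibling hashes along the path from one leaf up to the root.
function merkleProof(leaves: Buffer[], index: number): ProofStep[] {
  const proof: ProofStep[] = [];
  let level = leaves.slice();
  let i = index;
  while (level.length > 1) {
    const onLeft = i % 2 === 0;
    proof.push({
      sibling: onLeft ? (level[i + 1] ?? level[i]!) : level[i - 1]!,
      leafOnLeft: onLeft,
    });
    const next: Buffer[] = [];
    for (let j = 0; j < level.length; j += 2) next.push(h(level[j]!, level[j + 1] ?? level[j]!));
    level = next;
    i = Math.floor(i / 2);
  }
  return proof;
}

// The statement the circuit proves: "this leaf hashes up to the published root."
function verifyMembership(leaf: Buffer, proof: ProofStep[], root: Buffer): boolean {
  let node = leaf;
  for (const step of proof) {
    node = step.leafOnLeft ? h(node, step.sibling) : h(step.sibling, node);
  }
  return node.equals(root);
}

const approved = ["alice", "bob", "carol", "dave"].map((a) => h(Buffer.from(a)));
const root = merkleRoot(approved);
console.log(verifyMembership(approved[2]!, merkleProof(approved, 2), root)); // true
```

The verifier holds only the root. In the ZK version, the leaf and the proof path become private witness inputs, which is exactly how "someone legitimate" stays someone.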
This was the inverse of how Tornado Cash failed. Instead of mixing everything and letting the courts sort it out, you prove membership in a clean set. Positive membership. "I AM good" — not "I'm NOT bad." The distinction sounds subtle. It's not. We'll come back to it.
In April 2025, 0xbow launched Privacy Pools on Ethereum mainnet based on this research. We watched closely.
The idea was right. But when we looked under the hood, we realized we couldn't just use it.
Close, But Not Ours
Two protocols sat closest to what we needed. Both fell short. For different reasons — but the result was the same. We couldn't build on either.
Privacy Pools had the right compliance model, but the implementation was tightly coupled. The ASP logic lived inside the pool contract. Not a separate module. Not a reusable package. If you wanted ASP-style compliance for your own token or your own pool, you'd have to fork the entire system and carry the maintenance burden.
Withdrawals were compliant. Deposits weren't — they're fully visible on-chain. Circuit changes required new trusted setup ceremonies, with all the coordination and trust assumptions those entail. And at launch, the maximum deposit was 1 ETH.
A valid proof of concept. Not infrastructure you can build an ecosystem on.
RAILGUN went further than anyone. Full UTXO privacy — sender, recipient, and amount all hidden. Multi-chain support. A working product with real volume. In terms of raw capability, it was the most complete privacy solution on Ethereum.
But three things made it a non-starter for us.
First, the code is UNLICENSED. Not MIT. Not Apache. Not GPL. Literally `SPDX-License-Identifier: UNLICENSED` in every contract file. You cannot legally fork it, modify it, or build on it without explicit written permission from the RAILGUN DAO.
Second, a 0.5% round-trip fee. Shield your tokens: 0.25%. Unshield: another 0.25%. On a $100,000 stablecoin transaction, that's $500 gone. For a stablecoin — where people move in and out frequently — the fees compound fast.
Third, and most important: the compliance model. RAILGUN uses Private Proof of Innocence (PPOI). The idea sounds reasonable — prove your funds are NOT from a known bad actor. But the mechanism has a critical flaw.
PPOI checks happen at deposit time. If you're flagged — correctly or incorrectly — your funds are permanently excluded from the system. No remediation. No appeal. No re-proofing against updated data. Once excluded, always excluded. And researchers identified a straightforward bypass: transfer stolen funds to a clean wallet first, then shield from there. The PPOI check passes because the clean wallet has no history. The system catches the obvious cases and misses the sophisticated ones.
In January 2023, the FBI alleged that North Korea's Lazarus Group used RAILGUN to launder over $60 million stolen from the Harmony Bridge hack. Whether the specifics of that allegation are precisely accurate or not, it revealed something important: the protocol's compliance story wasn't strong enough to prevent it.
We liked the ASP idea from Privacy Pools. We respected what RAILGUN built. But we couldn't use either. So we started extracting the parts that worked and rebuilding the parts that didn't.
Proving You're Good
We took the ASP concept and ripped it out of the pool.
Made it standalone. Made it pluggable. Made it something any protocol could drop in without forking an entire privacy system.
Universal Private Compliance (UPC): a standalone SDK where any protocol can add zero-knowledge compliance verification. At the center is a single Solidity interface — `IAttestationVerifier`. Implement it however your compliance model requires.
What does that look like in practice? Chainalysis operates an OFAC sanctions screening ASP. A crypto exchange runs a KYC ASP using their existing identity verification stack. A government agency operates a sanctions ASP. A DAO runs a community membership ASP. Different compliance needs, different operators, different criteria — all flowing through the same interface. All verified on-chain with zero-knowledge proofs. The verifier learns "someone in the approved set authorized this transaction." Not who.
This is the difference that matters: positive membership versus negative exclusion. PPOI says "prove you're NOT bad." UPC says "prove you ARE good." With positive membership, if your status changes — you complete KYC, you're removed from a sanctions list, a false positive is corrected — you re-prove against an updated set. Your funds aren't permanently tainted. With PPOI, there's no second chance.
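The difference can be modeled in a few lines of TypeScript. This is a toy model of the two policies — not UPC's or RAILGUN's actual API — but it shows why one failure mode is permanent and the other is not.

```typescript
// Toy model of the two compliance policies — not any protocol's real API.

// Negative exclusion (PPOI-style): screened once at deposit; a flag is forever.
class ExclusionPolicy {
  private tainted = new Set<string>();
  deposit(user: string, flagged: boolean): void {
    if (flagged) this.tainted.add(user); // sticks even if the flag was a false positive
  }
  canWithdraw(user: string): boolean {
    return !this.tainted.has(user);
  }
}

// Positive membership (ASP-style): every withdrawal proves membership in the
// CURRENT approved set, so a corrected set restores access.
class MembershipPolicy {
  constructor(private approved: Set<string>) {}
  publishUpdatedSet(approved: Set<string>): void {
    this.approved = approved; // in the real system: a new Merkle root
  }
  canWithdraw(user: string): boolean {
    return this.approved.has(user); // in the real system: a ZK membership proof
  }
}

// Alice is wrongly flagged at deposit time.
const ppoi = new ExclusionPolicy();
ppoi.deposit("alice", true);
console.log(ppoi.canWithdraw("alice")); // false — no correction can change it

// Under positive membership, the correction lands in the next published set.
const asp = new MembershipPolicy(new Set(["bob"]));
asp.publishUpdatedSet(new Set(["bob", "alice"]));
console.log(asp.canWithdraw("alice")); // true — re-proved against the updated set
```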
And unlike Privacy Pools, the compliance layer is completely decoupled from any specific pool or protocol. It's infrastructure, not a feature of one product.
We had compliance. But the cryptography underneath it needed to be better than what everyone else was using.
The Quantum Clock Is Ticking
Almost every privacy project on Ethereum uses BN254. It's the elliptic curve behind BabyJubJub, the default in circom and snarkjs, the path of least resistance. It works. It's fast. But its pairing-based security sits around 100 bits. For a system that people will trust with real value, that margin is thinner than it should be.
We chose BLS12-381. 128-bit security — the NIST standard. It's the same curve Ethereum's beacon chain uses for consensus. If it's good enough to secure the network itself, it's good enough for privacy proofs. And since the Pectra upgrade in May 2025, EIP-2537 precompiles make BLS12-381 operations gas-efficient on-chain. No more prohibitive verification costs.
For the proof system, we went with PLONK over a universal setup. No per-circuit trusted setup ceremony. No "who ran the ceremony and did they actually delete the toxic waste?" audit question. The setup is deterministic and reproducible.
But SNARKs — all of them, regardless of curve — have a structural ceiling. They rely on elliptic curve assumptions. A sufficiently powerful quantum computer breaks them. For everyday transfers, where the data exposure window is short, this is an acceptable risk. The transaction is settled before any quantum attacker could exploit it.
For high-value, long-term storage, it's not acceptable at all.
So we built a second mode: a Circle STARK verifier, written in native Solidity. Post-quantum secure. No trusted setup of any kind. Higher gas cost — roughly 20 million gas versus 200,000 for a SNARK verification — but the threat model is entirely different.
"Harvest now, decrypt later" is not a theoretical concern. State-level adversaries are already collecting encrypted data from public blockchains, banking on future quantum capabilities to decrypt it. A STARK-based vault protects against that scenario. Even if quantum computing arrives in ten or twenty years, the proofs remain sound.
Both systems run side by side. SNARKs for everyday shielded transfers — fast, cheap, practical. STARKs for long-term high-value storage — heavier, but quantum-resistant. Not a migration path. Different tools for different threats. All at NIST-required 128-bit security. All using Ethereum precompiles.
We had compliance. We had the cryptography. Now we had to decide what to do with it.
Your Brand, Your Rules, Our Cryptography
We could have built one product. A privacy pool with compliance built in, a stablecoin on top, a single frontend, and called it a day.
Instead, we built SDKs.
Every component is an independent, publishable package under the @permissionless-technologies npm org:
UPP — Universal Private Pool. Shields any ERC20 token into a shared Merkle tree. One pool, one anonymity set for all tokens. More users means more privacy for everyone. Shield, transfer privately, merge notes, withdraw — with optional ASP compliance at each step. Viewing keys give granular, per-transaction audit access — share what you need to share with your auditor without exposing everything. And a ragequit mechanism guarantees that the original depositor can always exit to their own address, even if every ASP on the planet refuses them. No hostage situations. Ever.
UPC — Universal Private Compliance. The ASP framework we just described, fully decoupled from any pool. Pluggable verifiers, pluggable storage, on-chain registry. Works with UPP. Works with anything else.
UPD — Universal Private Dollar. The stablecoin we originally set out to build. No freeze function — not as a removed feature, but as something that was never implemented. No admin keys. Overcollateralized by stETH, fully on-chain and verifiable by anyone. A standard ERC20 that works everywhere: DEXes, lending protocols, bridges, wallets.
UPH — Universal Private Helpers. The cryptographic primitives underneath all of it — M31 field arithmetic, Circle STARK verifiers, Merkle trees, BLS12-381 precompile wrappers, Poseidon hash functions. What OpenZeppelin did for contract patterns, but for zero-knowledge primitives.
Why SDKs instead of a monolith? Because a monolith forces everyone into one UX, one chain, one compliance model. With SDKs, a wallet integrates UPP and adds a "Shield" button next to "Send." A DEX offers private swaps using the same shared pool. A bridge does cross-chain privacy. A stablecoin issuer adds privacy without modifying their token contract. A compliance firm operates an ASP through UPC and sells compliance-as-a-service to multiple protocols simultaneously.
We're eating our own cooking. upd.io is our consumer product — integrating all four SDKs into a polished interface where you can mint stablecoins, shield tokens, swap privately, and stake. But the SDKs are what we're really building. The infrastructure layer that others build on.
TypeScript. ESM. viem and wagmi. No ethers.js. No CommonJS baggage.
We know how this sounds. So let us talk about how we build it.
Security Is Not a Feature — It's the Process
Ambitious scope means nothing if the code can't be trusted. We're building financial infrastructure. The bar isn't "it works." The bar is "it works and we can prove why."
That's not a slogan. It's how we actually develop.
Every smart contract runs through Foundry fuzz testing — 1,000 randomized runs per test in CI, hunting for edge cases a human would never think to write. Across our repos, we maintain over 5,600 lines of test code against roughly the same amount of source — a test-to-source ratio that most DeFi projects don't come close to. Every state-changing function is guarded by OpenZeppelin's ReentrancyGuard. Every token transfer goes through SafeERC20. Access control is role-based via OpenZeppelin v5.6 — the latest stable release, not a pinned legacy version.
The UPP SDK has already been through an independent security audit. The findings were addressed. We're not done — formal verification of critical contract invariants is on the roadmap, and we're evaluating tools like Certora and Halmos to mathematically prove properties that fuzz testing can only approximate.
On the TypeScript side, strict mode isn't optional — it's enforced with noUncheckedIndexedAccess, noImplicitReturns, and every other strictness flag the compiler offers. If the type system can catch a bug at compile time, we'd rather it did.
Our Solidity is pinned to 0.8.29 — the latest stable compiler. We use custom errors over string reverts. We extract logic into external libraries rather than bloating contract bytecode, because a smaller attack surface is a safer attack surface. Deployments use deterministic CREATE2 so every address is predictable and verifiable before a single transaction is sent.
Eight ZK circuits — transfer, withdraw, merge, join-split, ASP membership — each with dedicated test harnesses for both PLONK and STARK proof verification. The circuits don't just get tested in isolation. They get tested end-to-end against the on-chain verifier contracts.
We're security-concerned people building security-critical software. That means we move carefully, test obsessively, and treat every external dependency as a liability until proven otherwise. Our dependency tree is minimal by design — @noble/curves for elliptic curve math, snarkjs for proof generation, OpenZeppelin for battle-tested contract patterns, and not much else.
Is all of this enough? No. It's never enough. That's the point. Security isn't a milestone you reach — it's a discipline you maintain. And we're looking for people who think the same way.
Come Swing With Us
A non-freezable stablecoin. A universal privacy pool for any ERC20. A modular compliance framework. Post-quantum cryptography. All shipped as independent, reusable SDKs — built by a team that treats every line of code as a potential attack vector.
Every single component above exists because we hit a wall and had to build the next layer ourselves. The stablecoin needed privacy. Privacy needed compliance. Compliance needed better cryptography. The cryptography needed to be packaged so others could use it. This isn't a roadmap dreamed up in a slide deck. It's what happened when we kept pulling on the thread.
We're looking for developers who read the section on BLS12-381 and thought "finally." Protocol designers who see the ASP model and immediately think of three use cases we haven't considered. Cryptographers who want to poke holes in our STARK verifier. Security researchers who want to find what our fuzz tests missed. Compliance thinkers who can tell us what we're getting wrong.
This isn't an invitation to "join the revolution." It's an invitation to review the code, build on the SDKs, challenge the assumptions, and break things before anyone else does.
Privacy and compliance are not opposites. We're building the proof.
Find us on GitHub, explore the documentation, or come talk to us on Telegram.