The lack of privacy protection is the Original Sin of all public blockchains – from Satoshi’s original Bitcoin whitepaper down to the most cutting-edge, modular, and parallelized network that does 100 million transactions per second with a zeptosecond finality time.

Generally speaking, user privacy goes against the nature of public blockchains: For a public ledger to function, some transaction data must be shared with nodes and network participants. The shortcut to getting these systems online quickly is simply to make everything public by default.

However, that ultimate transparency exposes users to surveillance, coercion, and unintended consequences like trade signal leakage. This is commercially unviable and corrosive to the right to determine one's destiny. True self-custody cannot exist if users don't control their data; privacy is about reinstating users' freedom to select what they do and don't reveal to the outside world.

Here are seven fatal flaws that are common in crypto privacy tools:

Sin 1 – Centralized Systems

In a decentralized world, centralization is sloth. It's easier (faster and cheaper) to run a ledger on a bank's internal SQL database than to send transactions on even the most performant blockchains.

However, decentralization equates to resilience. It’s the reason crypto has any market value. Without it, users would be better off with centralized institutions’ speed and cost savings.

This is even more important for privacy protocols, where centralization means developers are giving themselves privileged access to users’ data.

Protocol creators should never give themselves admin keys that can freeze or deanonymize users. (RAILGUN uses mechanisms like Viewing Keys to provide non-discriminatory, user-controlled transparency where needed.) 
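The separation between spending authority and viewing access can be sketched in code. This is an illustrative toy, not RAILGUN's actual scheme (which uses elliptic-curve key derivation and authenticated encryption); the point is only the structure: two keys derived independently from one seed, where handing out the viewing key reveals history without granting any power to move funds.

```python
import hashlib
import hmac
import os

# Toy sketch only: real protocols use elliptic-curve keys and
# authenticated encryption, not this HMAC/XOR construction.

def derive_keys(seed: bytes) -> tuple[bytes, bytes]:
    """Derive independent spending and viewing keys from one seed."""
    spend_key = hmac.new(seed, b"spend", hashlib.sha256).digest()
    view_key = hmac.new(seed, b"view", hashlib.sha256).digest()
    return spend_key, view_key

def encrypt_memo(view_key: bytes, memo: bytes) -> bytes:
    """Encrypt transaction metadata readable only with the viewing key."""
    keystream = hashlib.sha256(view_key + b"memo").digest()
    return bytes(m ^ k for m, k in zip(memo, keystream))

def decrypt_memo(view_key: bytes, ciphertext: bytes) -> bytes:
    return encrypt_memo(view_key, ciphertext)  # XOR is its own inverse

seed = os.urandom(32)
spend_key, view_key = derive_keys(seed)
blob = encrypt_memo(view_key, b"sent 1.5 ETH to 0xabc")
# The user can hand view_key to an auditor: it reveals history,
# but is useless for authorizing spends (a different key entirely).
assert decrypt_memo(view_key, blob) == b"sent 1.5 ETH to 0xabc"
assert spend_key != view_key
```

Because disclosure is initiated by the user, transparency stays non-discriminatory: no admin can decrypt on the user's behalf.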

Another centralization vector is threshold multi-sigs, particularly for protocols seeking to bypass insecure bridges. Even when set up "properly," a 3 of 5 multi-sig arguably has worse trust assumptions than your neighborhood bank.

And when the multi-sig isn't configured correctly…
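A quick count makes the trust assumption concrete. In a 3-of-5 scheme, every 3-key subset is a quorum that can unilaterally move or freeze funds, and there are ten such subsets (signer labels below are hypothetical):

```python
from itertools import combinations
from math import comb

# Hypothetical signer set for a 3-of-5 protocol multi-sig.
signers = ["dev1", "dev2", "dev3", "investor", "auditor"]
threshold = 3

# Every 3-key subset is a quorum that can act without the other two.
quorums = list(combinations(signers, threshold))
assert len(quorums) == comb(5, 3) == 10
```

Each of those ten colluding (or compromised) subsets is a complete takeover path, with none of the legal accountability a regulated custodian carries.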

Sin 2 – Lust for Logging

Privacy tools should take every measure to ensure they do not track user activity, particularly personally identifiable data such as IP addresses and browsing history.

Privacy protocols should be designed with an all-encompassing philosophy, because it takes only a momentary lapse of judgment to deanonymize users.

For example, Railway Wallet (which has integrated RAILGUN privacy tech) proxies RPC calls by default for all users so that even if someone isn’t using a VPN (which they should 🙁), their IP isn’t leaked to RPC nodes.
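The idea behind default RPC proxying can be sketched with standard-library HTTP tooling. The relay URL below is a placeholder, not a real Railway endpoint; the point is only that the node operator sees the relay's IP address, never the user's:

```python
import json
import urllib.request

# Hypothetical relay endpoint standing in front of the real RPC node.
RELAY_URL = "https://rpc-relay.example/eth"

def build_rpc_request(method: str, params: list,
                      url: str = RELAY_URL) -> urllib.request.Request:
    """Build a JSON-RPC 2.0 request addressed to the relay, not the node."""
    payload = json.dumps({
        "jsonrpc": "2.0", "id": 1,
        "method": method, "params": params,
    }).encode()
    return urllib.request.Request(
        url, data=payload,
        headers={"Content-Type": "application/json"},
    )

def proxied_call(method: str, params: list) -> dict:
    """Send the call via the relay; our IP is visible only to the relay."""
    with urllib.request.urlopen(build_rpc_request(method, params)) as resp:
        return json.loads(resp.read())

req = build_rpc_request("eth_blockNumber", [])
assert req.full_url == RELAY_URL       # traffic goes to the relay...
assert b"eth_blockNumber" in req.data  # ...carrying the real query
```

A wallet that does this by default protects even users who never think to configure a VPN.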

Sin 3 – Encrypted State

Why not make the entire system private? It’s tempting… but having a fully encrypted state is as undesirable, in some ways, as being fully public.

An encrypted state creates a black box where users and observers do not know what the dApp is doing. It eliminates the most significant security feature of blockchains: public auditability.

If the dApp is private, how do you verify that its economics and actors are behaving correctly? How do you respond properly to an exploit or malicious attempt if you can't tell whether something has happened?

User privacy is good – and so is protocol transparency.

Sin 4 – Dependency on Specific Manufacturers

Being “trustless” means you don’t have to trust a third party (i.e., a company, agent or bank teller) to ensure a protocol works. A strength of zero-knowledge-based encryption is that it creates fewer dependencies, including on hardware manufacturers.

Consider, for example, a privacy system that relies on Intel's Software Guard Extensions (SGX), built into its CPUs. The security of your system then depends on a potential single point of failure: trusting Intel to have implemented its product correctly.

Intel’s incentives are to act appropriately, but relying on SGX creates a constant vulnerability and an unnecessary assumption of trust. There are also gatekeeping-by-design considerations, as SGX requires specialized hardware that is relatively expensive, obscure and hard to maintain. In contrast, a proof-of-stake validator can be run on a Raspberry Pi.

Sin 5 – Going Rogue

Crypto privacy is a compelling narrative, but it’s not a strong enough value proposition to warrant building an entirely new blockchain or rollup (unless the specialty chain brings a strict technical innovation).

Privacy systems are most impactful when available on chains where users and financial activity already exist. For better or worse, DeFi has congregated around Ethereum, the EVM, and a few other environments like Solana. Solidity is king and has thus benefited from the most security research.

Spinning up a novel execution environment and enticing developers and users takes time and often unsustainable incentives. Meanwhile, billions of dollars in value already sits on public chains in desperate need of privacy.

Dedicated privacy chains also create additional security questions, such as requiring bridges – which have been demonstrated time and time again to be the least secure component of blockchain networks. Other concerns include centralization of consensus, validation and sequencers.

Sin 6 – Builder Complexity

Developers are often thought of as being geniuses (and some are). However, cryptography is difficult enough that forcing builders to learn and use a proprietary language, toolchain, or ecosystem is unnecessarily complex and counterproductive. 

Contracts written in languages like Solidity or Vyper are portable among networks supporting the EVM. That's not the case for Rust and other WebAssembly-based chains, which each have their own runtime standards. From a builder's standpoint, that means a separate contract codebase must be maintained for each chain, despite them using the same language.

As a result, the product is less accessible.

Sin 7 – Immature Tech

“Magic Internet Money” is a genuinely excellent meme. However, crypto developers are building financial technology that has real-world consequences and handles real money.

Privacy tech has a double duty: it must account for the "realness of money" and for "privacy" itself – i.e., it has to be secure against financial exploits AND against anything that may deanonymize users. The significant body of existing academic research on the technology is there for a reason.

Lest you end up like IOTA, a tried-and-true axiom is “never roll your own cryptography.”

Privacy tech, in particular, should be battle-tested and thought-through, with extensive audits from security firms, assessments from privacy advocates, pen testing by white hats, etc.

Otherwise, how can you expect people – especially the hoped-for new mainstream users – to risk their identity and money on a complex technological platform?


Public blockchains are “dox-by-design.” It’s no easy feat to build on-chain privacy systems while preserving the reasons to use crypto in the first place, such as auditability and decentralization.

A great resource for assessing the privacy levels of your chosen privacy tool is the Web3 Privacy Now initiative, which has categorized and scored various crypto privacy tools. Check it out as an excellent first step toward safeguarding your online identity and your finances.
