Dreams in the Gulf Stream: Solana’s mempool-less protocol
Introduction
Blockchains have many components, and one of the most well-known and vital is the mempool, or memory pool. This is partly because the two largest blockchains, Bitcoin and Ethereum, both use mempools.
Consequently, the presence of a mempool has become synonymous with the concept of a "real" blockchain, leading some to argue that a blockchain cannot function without one. However, this conventional approach has inherent limitations: transactions residing in mempools are not processed immediately and, depending on the fees paid, can remain in the mempool for extended periods before being included in blocks. This delay creates bottlenecks, particularly during times of high network congestion, and exposes transactions to Maximal Extractable Value (MEV).
The Solana blockchain took a different approach, removing the mempool from its design entirely, which increased scalability, reduced block confirmation times and made for an overall easier and smoother user experience. In this article, we will cover what mempools are, how they are used, how Solana's approach differs from other blockchains, and what we can learn from it.
A brief background of mempools will help one better understand and appreciate Solana’s different take on this.
So what is a mempool?
A mempool, also known as a memory pool, transaction pool, or transaction queue, is an in-memory data structure within blockchain nodes that temporarily stores pending transactions before they are included in blocks. This temporary storage acts as a waiting area where transactions reside until they are validated and incorporated into the blockchain by miners (in the case of Proof of Work chains) or validators (in the case of Proof of Stake chains).
The mempool is a crucial component of the blockchain ecosystem, ensuring that all submitted transactions are recorded and processed in an orderly manner.
The prioritization of transactions within a mempool is primarily based on the fees attached to each transaction. Transactions with higher fees are given priority and are included in the next available block more quickly than those with lower fees. This fee-based prioritization mechanism incentivizes users to pay higher fees to expedite the processing of their transactions, especially during times of network congestion when the number of pending transactions can be significant.
This fee-based system ensures that miners or validators are compensated for their work in maintaining the network, aligning economic incentives with network security and efficiency.
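As a toy illustration of this fee-based ordering, the sketch below models a mempool as a priority queue keyed on fee. It is a simplified, assumed model rather than any specific client's implementation: Python's heapq is a min-heap, so pushing the negative fee makes the highest-paying transaction pop first when a block is assembled.

```python
import heapq

# Toy fee-prioritized mempool: highest fee is included first.
mempool = []
heapq.heappush(mempool, (-50, "tx_low_fee_50"))
heapq.heappush(mempool, (-500, "tx_high_fee_500"))
heapq.heappush(mempool, (-120, "tx_mid_fee_120"))

block, block_capacity = [], 2
while mempool and len(block) < block_capacity:
    neg_fee, tx = heapq.heappop(mempool)
    block.append((tx, -neg_fee))

print(block)    # [('tx_high_fee_500', 500), ('tx_mid_fee_120', 120)]
print(mempool)  # the low-fee transaction keeps waiting for a later block
```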
Prior to a transaction being added to the mempool, it must first pass a series of validation checks performed by a node. These checks ensure that the transaction conforms to the standard transaction specifications set by the blockchain protocol. If a transaction meets all the necessary criteria, it is added to the mempool; if it fails any of the checks, it is disregarded.
Sample checks done by nodes prior to a transaction's inclusion in the mempool include, but aren't limited to, the following (a simplified sketch follows the list):
● Is the signature valid (i.e., the transaction has not been tampered with)?
● Are all required elements – such as the To: and From: addresses – present and valid?
● Is the From: address an external account (and not a contract account)?
● If the transaction transfers value, does the From: address balance cover the value?
● Does the transaction have a gas limit that’s below the block limit?
● Does the transaction fit into the local mempool?
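A hypothetical, heavily simplified sketch of the admission checks listed above is shown below. The names (Transaction, accept_into_mempool, BLOCK_GAS_LIMIT, and the helper callbacks) are illustrative assumptions, not any real client's API.

```python
from dataclasses import dataclass

BLOCK_GAS_LIMIT = 30_000_000   # assumed block gas limit
MAX_MEMPOOL_SIZE = 5_000       # assumed local mempool capacity

@dataclass
class Transaction:
    to_addr: str
    from_addr: str
    value: int
    gas_limit: int
    signature_valid: bool      # result of signature verification, stubbed here

def accept_into_mempool(tx, get_balance, is_contract, mempool) -> bool:
    if not tx.signature_valid:                  # signature check
        return False
    if not tx.to_addr or not tx.from_addr:      # required fields present
        return False
    if is_contract(tx.from_addr):               # sender must be an external account
        return False
    if get_balance(tx.from_addr) < tx.value:    # sender can cover the value
        return False
    if tx.gas_limit > BLOCK_GAS_LIMIT:          # fits within a block
        return False
    if len(mempool) >= MAX_MEMPOOL_SIZE:        # fits in the local mempool
        return False
    mempool.append(tx)
    return True

# Toy usage
pool = []
tx = Transaction("0xabc", "0xdef", value=10, gas_limit=21_000, signature_valid=True)
print(accept_into_mempool(tx, get_balance=lambda a: 100, is_contract=lambda a: False, mempool=pool))  # True
```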
The mempool in a blockchain network functions much like a waiting room in a medical office, where pending transactions are temporarily held before being included in a block. Before a transaction can enter this waiting room, it must first pass through a reception area where a receptionist, akin to a node in the blockchain, checks to ensure that all necessary requirements are met. This preliminary screening ensures that only valid and properly formatted transactions are allowed into the mempool.
Just as a receptionist verifies a patient's appointment details and personal information, a node performs various validation checks on a transaction. These checks include verifying the transaction's signature to ensure it hasn't been tampered with, confirming the presence and validity of required elements such as the "To:" and "From:" addresses, and ensuring the sender has sufficient funds to cover the transaction amount. Only after these checks are satisfactorily completed is the transaction added to the mempool, similar to a patient being admitted to the waiting room after passing the reception checks.
Once in the waiting room, patients must wait for their turn to see the doctor, which parallels how transactions wait in the mempool until they are picked up for inclusion in a block. This waiting period can vary depending on the transaction fee paid and the current level of network congestion. Higher-fee transactions are often prioritized, much like how urgent cases might be seen by a doctor sooner.
In these blockchain networks, there isn't a singular, global mempool; rather, each node maintains its own mempool. This means that the set of pending transactions can vary between nodes, as each one independently validates and stores transactions before they are included in a block. To facilitate better coordination and synchronization, Bitcoin Improvement Proposal 35 (BIP 35) introduced a way for a node to expose its mempool to other nodes, enabling them to share information about pending transactions.
In blockchains like Bitcoin and Ethereum, the protocol used for propagating transactions is known as a gossip protocol. This protocol is a distributed propagation method where each node periodically sends messages to a subset of other random nodes.
In blockchain-specific implementations, this propagation often involves a probabilistic, hash-based data structure called a bloom filter. A bloom filter compactly summarizes the set of transactions in a node's mempool; by comparing filters, nodes can identify which transactions their peers are missing and synchronize quickly.
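Here is a minimal, assumed sketch of that idea: one node summarizes its mempool in a bloom filter, and a peer tests its own transactions against the filter to decide which ones to forward. This illustrates the data structure only, not Bitcoin's or Ethereum's actual wire protocol.

```python
import hashlib

class BloomFilter:
    """A tiny bloom filter: a compact, probabilistic summary of a set."""
    def __init__(self, size_bits=1024, num_hashes=3):
        self.size = size_bits
        self.num_hashes = num_hashes
        self.bits = bytearray(size_bits // 8)

    def _positions(self, item: bytes):
        # Derive several bit positions per item from salted SHA-256 hashes.
        for i in range(self.num_hashes):
            digest = hashlib.sha256(bytes([i]) + item).digest()
            yield int.from_bytes(digest, "big") % self.size

    def add(self, item: bytes):
        for pos in self._positions(item):
            self.bits[pos // 8] |= 1 << (pos % 8)

    def probably_contains(self, item: bytes) -> bool:
        return all(self.bits[pos // 8] & (1 << (pos % 8)) for pos in self._positions(item))

# Node A summarizes its mempool and shares the filter with node B.
node_a = BloomFilter()
for txid in [b"tx1", b"tx2", b"tx3"]:
    node_a.add(txid)

# Node B forwards only the transactions node A is (probably) missing.
to_forward = [tx for tx in [b"tx2", b"tx4"] if not node_a.probably_contains(tx)]
print(to_forward)  # [b'tx4']
```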
In summary, the use of mempools, while integral to the functioning of many blockchains, presents scalability challenges. Transactions must wait in line, often prioritized by the fees attached to them, leading to slower overall processing times. This can result in significant delays, especially for lower-fee transactions, which may languish in the mempool for an indeterminate period, impacting user experience.
However, mempool-based blockchains have certain advantages; namely, they ensure orderly transaction processing, maintaining the sequence and integrity of transactions as they are included in blocks. This methodical approach provides a reliable and predictable framework for transaction verification and inclusion.
What is Gulf Stream protocol?
One of the eight key innovations of the Solana blockchain is Gulf Stream, a transaction caching and forwarding protocol that eliminates the need for a mempool. As covered above, traditional blockchain systems typically use a mempool to temporarily hold pending transactions before they are included in a block.
Gulf Stream revolutionizes this process by forwarding transactions directly to the current validator leader and the next few validator leaders, thereby making the mempool redundant in this network. This method enhances transaction throughput and reduces latency.
Each elected validator leader is responsible for producing four consecutive blocks, one per slot of approximately 400 milliseconds. By preemptively caching transactions with the current and upcoming validator leaders, Gulf Stream ensures that these leaders have immediate access to pending transactions, allowing them to quickly include these transactions in the blocks they produce.
The propagation of block data from one validator to the next is facilitated by another Solana innovation called Turbine, but we won't go into it here, as the inner workings of that mechanism are beyond the scope of this article.
Turbine plays a crucial role, in tandem with Gulf Stream, in ensuring that transactions, and the blocks that contain them, are rapidly and securely propagated across the network.
Because the current and upcoming leaders are known in advance via a leader schedule, and each leader produces a known number of consecutive blocks, transactions can be forwarded directly to those validators instead of being added to a mempool to await inclusion in the next block. This process significantly enhances transaction processing efficiency.
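A minimal sketch of this forwarding idea, under assumed names (forward_transaction, leader_schedule and send_to are illustrative, not Solana's actual RPC or TPU interfaces): a node looks up the leader schedule and sends the signed transaction straight to the current leader and the next few scheduled leaders, with no mempool in between.

```python
def forward_transaction(tx: bytes, current_slot: int, leader_schedule: dict, send_to, depth: int = 3) -> None:
    """Send tx to the current leader and the next few distinct upcoming leaders."""
    targets = []
    for slot in range(current_slot, current_slot + 16):   # look a few slots ahead
        leader = leader_schedule.get(slot)
        if leader and leader not in targets:
            targets.append(leader)
        if len(targets) == depth:
            break
    for leader in targets:
        send_to(leader, tx)   # each leader caches the transaction until its slots arrive

# Toy schedule: each leader gets 4 consecutive ~400 ms slots.
schedule = {slot: ["validator_A", "validator_B", "validator_C"][slot // 4] for slot in range(12)}
forward_transaction(b"signed-tx-bytes", current_slot=2, leader_schedule=schedule,
                    send_to=lambda leader, tx: print("forwarding to", leader))
```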
A block is considered “final” when an additional 31 blocks are built on top of it. This concept of finality is important as it provides certainty that the transactions within the block are permanently recorded on the blockchain and cannot be altered or rolled back.
The proactive transaction forwarding mechanism of Gulf Stream is further strengthened by a stake-weighted validator selection process for transaction processing.
In this system, validators with the most stake (funds locked up with a validator to ensure the security of the Solana network) are given priority for processing transactions. This prioritization ensures that validators who have the most to lose, and are thus highly incentivized to maintain network integrity, are at the forefront of transaction validation.
Additionally, the stake-weighted selection process acts as an effective Sybil resistance mechanism, which is critical for preventing denial-of-service (DoS)-type attacks on the Solana network by deterring malicious actors who might try to flood it with excessive transactions.
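As an aside, stake-weighted selection is straightforward to picture. The toy sketch below (with assumed stake figures, not real validator data) picks validators proportionally to their stake, which is also why spinning up many zero-stake Sybil identities buys an attacker essentially nothing.

```python
import random

# Assumed, illustrative stake amounts.
validators = {"validator_A": 5_000_000, "validator_B": 1_000_000, "validator_C": 250_000}

def pick_validator(stakes: dict) -> str:
    names, weights = zip(*stakes.items())
    return random.choices(names, weights=weights, k=1)[0]

picks = [pick_validator(validators) for _ in range(10_000)]
print({name: picks.count(name) for name in validators})  # counts roughly proportional to stake
```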
Conclusion
In this article, we covered the history of mempools, their function, and how they work in various blockchains. We also explored how Solana's different approach helps make it one of the fastest, most secure, and most user-friendly blockchains.
Gulf Stream does not achieve this impact on its own; it is one of eight innovations Solana Labs introduced that, working in concert, produce the blockchain's current performance and efficiency.
Not all blockchains need a mempool, as Solana has demonstrated for years. In fact, some Directed Acyclic Graph (DAG)-based protocols also operate without a traditional mempool, further highlighting the diverse approaches to blockchain architecture.
Blockchain consensus mechanisms
There seems to be a misunderstanding, or perhaps an acceptance for the sake of simplicity, of what consensus mechanisms blockchains use.
Everywhere you read on the internet, you see some form of the following: "Proof of Stake (POS) is the consensus algorithm for XYZ chain" or "Proof of Work (POW) is the consensus algorithm for XYZ chain".
But Proof of Stake (POS) and Proof of Work (POW) aren't exactly consensus mechanisms (at least not on their own).
POW & POS are mechanisms to prevent Sybil attacks. A Sybil attack is a malicious attack that involves forging multiple identities to gain an undue advantage within a network.
Here is an example overview of various POW & POS blockchains and the Sybil resistance mechanisms and consensus mechanisms they use.
| Blockchain | Sybil resistance mechanism | Consensus mechanism |
| --- | --- | --- |
| Bitcoin | POW | Nakamoto Consensus / longest chain rule |
| Ethereum | POW | Longest chain rule variation |
| Cosmos | POS | Tendermint |
| Solana | POS | Tower BFT |
| Avalanche | POS | Snow |
Furthermore, it seems most people think POW originated with Bitcoin, and that POS originated with the dozen or so POS chains that currently exist or formerly existed.
Long before the first POS cryptocurrency, Peercoin, computer scientist Wei Dai released b-money, which is arguably the first instance, or origin, of proof of stake:
“In the second protocol, the accounts of who has how much money are kept by a subset of the participants (called servers from now on) instead of everyone.
These servers are linked by a Usenet-style broadcast channel. The format of transaction messages broadcasted on this channel remain the same as in the first protocol, but the affected participants of each transaction should verify that the message has been received and successfully processed by a randomly selected subset of the servers. Since the servers must be trusted to a degree, some mechanism is needed to keep them honest.
Each server is required to deposit a certain amount of money in a special account to be used as potential fines or rewards for proof of misconduct. Also, each server must periodically publish and commit to its current money creation and money ownership databases. Each participant should verify that his own account balances are correct and that the sum of the account balances is not greater than the total amount of money created. This prevents the servers, even in total collusion, from permanently and costlessly expanding the money supply. New servers can also use the published databases to synchronize with existing servers.“ (emphasis mine)
And long before Bitcoin came about with POW, bit gold, created by Nick Szabo, laid out the first blueprint of a currency that relied on proof of work, a concept invented by Naor and Dwork in 1993 as a mechanism to thwart denial-of-service attacks and spam:
”Here are the main steps of the bit gold system that I envision:
(1) A public string of bits, the "challenge string," is created (see step 5).
(2) Alice on her computer generates the proof of work string from the challenge bits using a benchmark function.
(3) The proof of work is securely timestamped. This should work in a distributed fashion, with several different timestamp services so that no particular timestamp service need be substantially relied on.
(4) Alice adds the challenge string and the timestamped proof of work string to a distributed property title registry for bit gold. Here, too, no single server is substantially relied on to properly operate the registry.
(5) The last-created string of bit gold provides the challenge bits for the next-created string.
(6) To verify that Alice is the owner of a particular string of bit gold, Bob checks the unforgeable chain of title in the bit gold title registry.
(7) To assay the value of a string of bit gold, Bob checks and verifies the challenge bits, the proof of work string, and the timestamp.” (emphasis mine)
Perhaps, as stated earlier in this post, it's common nomenclature to call POW/POS "consensus protocols" for simplicity, but the point I wanted to bring across is:
Proof of Work/Stake alone does not consensus make
Killer applications for Blockchains thus far
Blockchains, and cryptocurrencies by extension, have captured the imagination of almost everyone. Hardly a day goes by without Bitcoin being mentioned in the news.
By and large, most things in this industry are based on speculation, but if you look a bit further out and a bit deeper in, you will see the seeds of something incredible happening before our very eyes!
In this short article, I highlight some of the “killer products” for blockchains, beyond the speculative trading of Bitcoin. Coincidentally enough, some of these use-cases fly in the face of Bitcoin maximalists who posit that blockchains are only good as money and shouldn’t be used for anything else - the derision from this group of maximalists is always “you might as well use AWS for X,Y,Z“, but as you will see, using AWS for some of these functionalities misses some of the unique cryptographic properties offered by blockchains.
Here are, in my opinion, some of the main themes and killer products/applications of blockchains thus far:
1. Non-state issued Money (Bitcoin, stablecoins)
The first breakthrough product for blockchains was money (Bitcoin) - to be more specific, non-state-issued money.
Bitcoin has gone on to evolve as a store of value, and another blockchain-based money product has taken over - stablecoins. Predominantly built on the Ethereum blockchain, stablecoin issuers have expanded to other blockchains such as Tron, Solana, etc.
These privately created monies, pegged to fiat currencies, have blossomed and become the lifeblood of the crypto markets.
The BIS (Bank for International Settlements), a kind of central bank for central banks, even posited that these stablecoins can co-exist with CBDCs (central bank digital currencies).
Make no mistake - Bitcoin, and by extension stablecoins, have pushed the digitization of banks faster than anything.
Akin to how Tesla pushed the electrification of vehicles - at first they laughed, then scorned, and now every car company is suddenly speaking as experts, as if they always knew and had plans for electrification - utter nonsense! The same is the case with digitization: COVID may have been the fuel, but Bitcoin, and the broader crypto ecosystem, was and is the fire!
2. Fundraising (ICO, IEO, IDO)
A second breakthrough product for blockchains was fundraising mechanisms, which became all the rage in 2017. These became an alternative to IPOs for companies raising money from the public, and an extension of crowdfunding for projects raising money.
EOSIO blockchain developer Block.One raised over $4 billion in their year-long ICO. There have since been variations of the ICO model, namely:
ICO - initial coin offering
IEO - initial exchange offering
IDO - initial dex offering
The raises range from hundreds of thousands of dollars to hundreds of millions. It's crowdfunding of epic proportions.
3. Provable ownership of digital goods (NFT)
Another killer product for blockchains emerged during the same phase as the fundraising - provable ownership, by means of games like CryptoKitties.
NFTs have grown to be a wildly popular killer product for blockchains, especially in 2021. Cryptographically provable ownership of digital goods (art, games & items) will continue to grow in popularity outside the niche crypto ecosystem. We have seen art auction houses such as Sotheby's list NFT art that gets sold for millions.
Outside of the niche crypto ecosystem, NFTs are by far the biggest killer product. Dapper Labs, the company behind CryptoKitties, which also created the NFT standard on Ethereum, also has an NFT-based platform that is huge outside of crypto - NBA Top Shot, among others. They create officially licensed digital collectibles, which have become absolute hits among sports fans and stars alike. Dapper has also created its own blockchain - Flow.
Patents as NFTs are the next logical extension of this. Music, art and many more - all digital content that could be on-chain will be on-chain.
4. Decentralized Financial Services (Defi)
While Bitcoin & stablecoins, as money, were the first killer use case for blockchains, it wasn't until circa 2020, when Ethereum-based Defi came along, that another killer product was realized - financial services such as lending, borrowing, market making, etc. - some key primitives.
Defi, while for the most part currently speculative, has painted a picture of the world to come - financial services without the middleman, relying on smart contracts. A digital wallet essentially becomes your bank - akin to what M-Pesa and others around the world have already done. This continues to be a huge deal.
Products such as Compound and Aave have ballooned into massive borrowing/lending protocols handling billions of dollars.
Dexes (decentralized exchanges) also became huge as one of the financial services primitives, allowing for the listing & market making of almost anything imaginable. Uniswap is big (within crypto) and there have been many more Dexes doing incredible volume, listing fantastic stuff.
5. Advancements of privacy technologies
This may sound like a stretch, but bear with me while I make this case: privacy, via cryptographic means, has been brought to the fore with blockchains. Zcash and Monero are the two best-known blockchains that offer real privacy when transacting. They use different cryptographic primitives such as confidential transactions, Pedersen commitments, bulletproofs, zero knowledge proofs (ZKPs), and others.
As already covered in my previous post, zero knowledge proofs were mostly theoretical for close to 30-40 years, up until blockchains arrived and not only implemented but also advanced the field. I would recommend you read through that earlier post to see how much bigger the applications of ZKPs are - from currencies, to cybersecurity and verifiable identities.
I firmly believe ZKPs haven't come into their own yet, but they will be a big part of daily interactions in the future. They will be a big deal.
Zero Knowledge Proofs - an overview (without the maths)
If you spend enough time in the cryptocurrency/digital assets space, you will hear the words "Zero Knowledge" mentioned quite often. For most, the maths is a barrier: they simply block out all the maths and cryptography terminology when people talk about zero knowledge proofs.
In this short post, I will attempt to explain zero knowledge proofs as well as their many applications - all without the hard maths and cryptography formulae.
Zero Knowledge proofs aren't new; they have been around since the 1980s. Although mostly theoretical for decades, in recent years this field of study has taken on a life of its own with practical implementations in the blockchain space.
Cool, but what are Zero Knowledge proofs (zkp)?
It's a protocol used for mathematically proving knowledge of something by one party ("Prover") to another party ("Verifier"), in such a way that the proof doesn't leak any information beyond the fact that the claim is true.
Every Zero Knowledge proof contains the following properties:
Completeness - if the proof is true, the Prover can prove it repeatedly
Soundness - if the proof is false, it's very hard for a Prover to convince the Verifier that it's true
Zero Knowledge - if the proof is true, no other info about it is disclosed to the Verifier
In other words, one can prove knowledge of something, without disclosing that 'something' or providing clues that could lead to the knowledge of that 'thing', but done in such a way that the party you are proving to is satisfied of your knowledge of that 'thing'. These Zero Knowledge proofs prove knowledge of something, not merely its existence; i.e. with a zkp, I can prove my age, not merely prove that an age exists. This blogpost provides some illustrations of such examples, but for a deeper dive into zkps, this Matter Labs Awesome Zero Knowledge Proofs repo contains a lot of information.
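To make the Prover/Verifier exchange concrete, here is a toy sketch of the classic Schnorr identification protocol, an interactive zero knowledge proof of knowledge of a discrete logarithm. The group parameters below are deliberately tiny and purely illustrative; real systems use groups of cryptographic size.

```python
import secrets

# Toy parameters: p = 2q + 1 with p, q prime; g generates the order-q subgroup.
p, q, g = 2039, 1019, 4

x = secrets.randbelow(q)   # Prover's secret
y = pow(g, x, p)           # public value, known to the Verifier

# 1. Commitment: Prover picks a random r and sends t = g^r mod p
r = secrets.randbelow(q)
t = pow(g, r, p)

# 2. Challenge: Verifier replies with a random challenge c
c = secrets.randbelow(q)

# 3. Response: Prover sends s = r + c*x mod q
s = (r + c * x) % q

# 4. Verification: g^s must equal t * y^c mod p
assert pow(g, s, p) == (t * pow(y, c, p)) % p
print("Verifier is convinced the Prover knows x, without learning x")
```

The three properties above map directly onto this exchange: an honest Prover always passes (completeness), a Prover who doesn't know x can't answer a random challenge except by luck (soundness), and the transcript reveals nothing about x itself (zero knowledge).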
Given the number of years this field of study has had, we can be assured that the maths/cryptography is correct. With this assurance, we can forgo trying to fully understand the maths/cryptography and focus, instead, on the various applications and implementations of this novel cryptographic field of study.
Some major, well-known ZKP systems include zk-SNARKs, zk-STARKs and bulletproofs (efficient range proofs):
zk-SNARK is an acronym for Zero-Knowledge Succinct Non-Interactive Argument of Knowledge. This has the following explanation:
Zero-knowledge: if the statement is true, a verifier does not learn anything beyond the fact that the statement is true.
Succinct: The size of the proof needs to be small enough to be verified in a few milliseconds.
Non-Interactive: Only one set of information is sent to the verifier for verification, therefore there is no back and forth communication between the prover and verifier.
ARgument: A computationally sound proof: soundness holds against a prover that leverages polynomial-time, i.e. bounded computation.
of Knowledge: The proof cannot be constructed without access to the witness (the private input needed to prove the statement)
zk-STARK is an acronym for Zero-Knowledge Scalable Transparent ARgument of Knowledge. zk-STARKs improve on the scalability of zk-SNARKs and also remove the "trusted setup" that zk-SNARKs rely on. A trusted setup is a process that requires a trusted third party to initially set up the ZK proof system, and this reliance on a third party weakens the trust assumptions of such a system. zk-STARKs instead rely on publicly verifiable randomness to create trustless, verifiable computation systems.
Bulletproofs are short non-interactive zero-knowledge proofs that require no trusted setup. Bulletproofs are a new type of more efficient range proof (a range proof is basically a cryptographic proof that a secret number lies within a certain range; e.g. with a range proof, I can prove that my age is between 19 and 25 without expressly disclosing my age - just the range).
These ZKP systems differ in proof size, proof computation time, and so on. Their implementations in blockchain protocols help with the financial privacy, security and speed of the values transacted on those protocols.
ZKPs in Blockchains
Ethereum, Zcash, Grin, Monero, Beam, Mina & Aleo, etc are just a few of the many blockchain protocols that utilize some form of ZKP.
Since there is already ample content about ZKPs in the blockchain ecosystem, we won’t dwell much on this application/implementation.
ZKPs in CyberSecurity
Vulnerability/exploit disclosures
An interesting development of ZKPs in the CyberSecurity field is one spearheaded by DARPA (Defense Advanced Research Projects Agency) as part of their Securing Information for Encrypted Verification and Evaluation (SIEVE) program. Specifically, the focus is using ZKPs in vulnerability disclosures, such that a researcher can prove to a vendor not only that a particular vulnerability exists but also that they have an exploit for it, all without revealing any further details about the vulnerability or exploit in question.
Two teams that took part in this DARPA program, Galois and Trail of Bits, have already developed the capability to mathematically prove the exploitability of vulnerable software without revealing critical information. Trail of Bits have a more detailed walk-through of their thought process on this. Galois also have an overview of their approach in their Project Fromager.
Authentication (e.g username, password, MFA)
M-Pin is a client-server protocol that features two-factor client authentication as an alternative to username/password.
The basic idea is that a registered client is provided with a cryptographic secret, which it uses to prove its identity to a server for authentication purposes, all without ever disclosing the secret to the server. This means no information about that secret is ever stored on that server.
The cryptographic key is split into two factors for authentication: a user-selected PIN and a token (stored in the browser, for example). I recommend reading the paper linked above about this protocol.
There is already a live product that uses the M-Pin protocol, MIRACL Trust®, a cloud-based MFA platform that provides secure, multi-factor authentication to employees, partners, and external users without sending authentication credentials across the web for storage in the cloud.
Cyber Attribution?
U.S. intelligence and law enforcement agencies have been public about some attributions, without always being in a position to disclose how they came by that knowledge - it could be via sources, via compromising adversary infrastructure, etc. So it's understandable why it would not be operationally wise to disclose that information.
We are required to trust and believe the unnamed methods and sources from these agencies relating to their attribution.
Although I'm unsure how the actual mechanics would work, on a theoretical level this is another area where zero knowledge proofs could be beneficial for cyber attribution. Just as they have always done, agencies could make a statement, and prove they have this knowledge, in such a manner that the public, and indeed the outed adversary, wouldn't know exactly how they know, but would be assured that the knowledge and proof are indeed true.
We can leave it up to DARPA to fund another program to tackle this :-)
ZKPs in Decentralized Identity Standards
Microsoft and the Decentralized Identity Foundation, among others, have an initiative for a Zero-Knowledge Proof scheme that enhances user privacy and security for digital credential systems.
The specific scenarios their zk-vc scheme is looking at are: publishing a resume on a career networking app, checking the current status of work-history credentials, or interviewing for a new job.
I would highly recommend you read their paper, zero knowledge credentials with deferred revocation checks, on this if this is of interest to you.
Conclusion
Many of us, being non-cryptographers or mathematicians by training, tend to shut off our minds when reading/hearing about most things cryptography related (we only like the cryptocurrencies coz….hey, who doesn’t like money :-D ).
But I hope this short post has given you an overview of the various applications of zero knowledge proofs beyond just blockchains.
Current state of storing, restoring and bequeathing crypto assets / wallets
If you have been in crypto for a while, you will no doubt have heard the term “Not Your Keys, Not Your Coins”. It’s essentially a view, adopted mostly by “Bitcoiners”, that encourages users to hold their own private keys (for Bitcoin) instead of entrusting the keys to a third-party i.e. a Crypto Exchange.
It may seem like folly now, given the increasing number of custody solutions that exist in crypto - many of which are (somewhat) insured. Many have lost their Bitcoin/crypto due to exchanges being hacked or going down, etc. - more so back then than now, although even in recent memory there have been a few such cases.
The custody vs non-custody question is something that I, like many, have no doubt spent some time thinking about. Estate planning is another major one for me. To everything there is a pro and a con, including how to store your crypto.
Storing crypto keys is, fundamentally, a key management issue - one that relies on one's own operational security for storing one's wealth. And if cyber security has taught us anything, it's that passwords suck and people still suck at creating and/or storing passwords. This is despite important initiatives such as password managers (which everyone should use).
Placing such a huge responsibility on users to keep their keys safe is... no small feat. Quite frankly, I'm of the opinion that most users should choose trusted custody solutions (ironic, given that crypto is about "trustlessness" or, more aptly, trust-minimization).
Current state of key management
I won't focus on the cold vs warm storage of crypto in this post. I will rather focus on the core issue of said storage, namely the "non-custodial" key management component, from the view of a normal (even advanced) crypto user/hodler/investor, etc.
There has been much improvement on the key management front over the past few years within the crypto ecosystem.
Vitalik wrote a much-recommended post on the current state, which can be boiled down to:
Hardware wallets (private keys)
Mnemonic phrases
Multisig*
Social recovery
The problem with the first 2 is that you have to write down and safely store these private keys / mnemonic phrases. You create a crypto wallet - you are asked to write down and safely store your mnemonic phrase. Ditto when buying a hardware wallet.
One solution is of course to store your mnemonic phrase / keys in a bank vault, as many have done - and the hardware wallet while you're at it. This works.
On the social recovery, I won’t rehash what Vitalik has already written about (read it).
Multisig is another great solution, which I personally use for some wallets. There is Gnosis, Unchained Capital Vaults, Casa and a bunch more.
The basic idea is that no one key can spend your funds. In the case of Unchained Capital depicted above, one has two options:
Client controlled - you control 2 of your keys and they control the third
Multi-institution - you only control 1 key; Unchained and another institution control the other 2
I'm in favor of multi-institution, as opposed to client controlled (controlling 2 keys). But in either case, if you manage to lose your keys, Unchained/Casa, etc. would be able to help you. It's a sweet setup in that even if the companies go bankrupt or cease to exist, you can still recover. That is sweet!
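For intuition, here is a toy sketch of a 2-of-3 spending policy. Real multisig is enforced on-chain with ECDSA/Schnorr signatures and Bitcoin script or smart contracts; the HMAC "signatures" below are just a self-contained stand-in so the example runs on its own, and the key names are assumptions.

```python
import hashlib
import hmac

# Assumed, illustrative key set: two keys you hold plus one held by a custodian.
key_secrets = {"your_key_1": b"secret-1", "your_key_2": b"secret-2", "custodian_key": b"secret-3"}
THRESHOLD = 2  # any 2 of the 3 keys may authorize a spend

def sign(key_id: str, tx: bytes) -> str:
    return hmac.new(key_secrets[key_id], tx, hashlib.sha256).hexdigest()

def can_spend(tx: bytes, signatures: dict) -> bool:
    valid = sum(
        1 for key_id, sig in signatures.items()
        if key_id in key_secrets and hmac.compare_digest(sig, sign(key_id, tx))
    )
    return valid >= THRESHOLD

tx = b"send 0.1 BTC to bc1q..."
print(can_spend(tx, {"your_key_1": sign("your_key_1", tx)}))  # False: one key alone can never spend
print(can_spend(tx, {"your_key_1": sign("your_key_1", tx),
                     "custodian_key": sign("custodian_key", tx)}))  # True: threshold reached
```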
Forget keys?
Multisig is clearly the better way to securely store your crypto today, but what if there were an even more frictionless solution? We always talk about mass crypto adoption, and I don't know if we can have that with the current friction of multisig solutions.
The way I see it, when the mass adoption happens, it will mostly be via custody solutions i.e. Exchanges, etc. And the security of these has improved over the last few years, with many being regulated (e.g. some in the United States). That’s just my opinion.
But more on the side of self-custody: what if you could have the security of multisig without private keys or mnemonic phrases? Wouldn't that be even sweeter?! Well, such a solution does exist - enter ZenGo's threshold sigs* (Note: I'm not shilling ZenGo; I use it too, as with the other solutions discussed. I just find their approach pleasant.)
The ZenGo wallet gives you the security of multisig (or rather, threshold signatures*) using the same workflows we are all accustomed to - email + cloud service + facial biometrics (FaceID, etc). One doesn't have to bother with keys. Better still, they also have in place a solution to recover your wallet should the business go under, similar to other entities that provide multisig.
This “keyless” security process is sweet. This is by far the most frictionless wallet that comes with as equally strong security guarantees. I’m excited to see more frictionless wallets in the market for the average user.
Estate planning
Death is inevitable (at least as far as I'm aware!). For most of us, the ideal situation is to leave our crypto assets to our loved ones when we pass away. Typically, one would have a last will and testament which stipulates what is to happen to your estate... but with crypto, you could stipulate what is to happen, yet how it is to happen is not so clear. At least not clear enough to do it securely.
So here are a few ways I know of to date (please let me know if there are others I’m unaware of):
Requesting from Exchanges
The easiest solution is if one uses a custodial service (e.g. a crypto exchange). In this case, a lot of crypto exchanges have an overview of what needs to be provided to them to release the crypto assets belonging to your departed loved one. For example, here is what Coinbase would require:
Death Certificate
Last Will and Testament - AND/OR - Probate Documents (either Probate, Letters Testamentary, Letters of Administration, Affidavit for Collection or Small Estate Affidavit)
Current, valid government-issued photo identification of the person(s) named in the Letters Issued
A letter signed by the person(s) named in the Probate Documents instructing Coinbase on what to do with the balance of the Coinbase account
Other crypto exchanges will have more or less similar requirements. Seems straightforward enough.
Password Manager Emergency Access
My favourite password manager, LastPass, has an Emergency Access feature which is perfect for estate planning. It allows you to grant one-time access to your LastPass Vault to a nominated person or persons. This will give them access to everything in your vault (passwords, notes, credit card details, etc). It also allows you to set a timed delay before your nominated person/s can access your vault. So you can set up Emergency Access for your wife, husband, brother, sister, etc. and set the time frame to, say, 6 months. The nominated person can request access, but while you are still alive, you can simply deny that request (LastPass will alert you). If you don't deny it, the nominated person still won't have access until the set time frame elapses (e.g. access will only be granted after 6 months, provided you haven't denied the request).
This is actually super cool and I wonder how many people are aware of it and have planned accordingly. With respect to crypto, this Emergency Access will let your loved ones access your crypto exchange accounts and hopefully log in and withdraw your assets. Or, if you stored your private keys / mnemonic phrases in your secure notes on LastPass, they can simply use those to restore your non-custodial wallet and access your assets.
But why store your mnemonic phrases in LastPass? What if it gets hacked?? Well, that threat is real, but other things you can do are enable 2FA (obviously) and restrict access to your LastPass to specific IP geolocations (e.g. even if my LastPass credentials are hacked and my 2FA device compromised, the attacker will only be able to log in to my LastPass from U.S. IP space; if they try to log in from China - access denied!).
Multisig inheritance
As discussed above, the multisig solution is quite neat. In fact, one multisig provider has such a solution - Casa Covenant, a Bitcoin inheritance solution. The catch is you must use one of their higher premium multisig packages (Diamond and Platinum). To their typical multisig setup, Casa Covenant adds a sixth key - the Inheritance key, which can be given to your estate lawyer.
Unchained Capital, another good multisig provider, has a similar service as well and I have no doubt a few more will spring up in the coming years.
“Keyless” hackery i.e. the ZenGo + Lastpass way
Another "hacky" way is to use a combination of ZenGo and LastPass. I have already explained LastPass Emergency Access; that could be paired with ZenGo so that you don't have to store keys / mnemonic phrases on LastPass.
ZenGo, a keyless and mnemonic-less non-custodial wallet, allows for the addition of a second face map to your wallet (i.e. the face map of your loved one).
Now, the caveat is the assumption of the device still being in your possession. If the device gets stolen, then we have to move to step two:
ZenGo allows for the backing up of the face maps to your chosen Cloud service. With the LastPass Emergency Access, your loved one, whose face map is added in ZenGo, will have access to 3 factors needed to restore your wallet - email account, the backup file and their face map.
Wait, what?
I did say this was a 'hacky' way to try to recreate something which doesn't require storing private keys / mnemonic phrases :-)
CryptoWill protocols (decentralised, on-chain wills)
There is research on cryptowill protocols - decentralised, on-chain, self-sovereign cryptographic wills for bequeathing cryptocurrencies without relying on any third parties. Such a protocol would ideally have:
updatability - as the owner of the cryptowill, you can update the will anytime before death
guaranteed access and privacy preservation - beneficiaries should only learn of their entitlement after your death
robustness - if the protocol relies on mediators, it must be robust against any malicious mediator
The scheme and cryptographic setup of this protocol are laid out in the above-mentioned paper. Give it a read if you can stomach the maths - but even if you can't, I still reckon it's worth a read!
Conclusion
I think I covered most of the things I wanted to cover in this post. As you can see, a lot of groundwork has been done, and more will be done in the future. No doubt other things already exist which I don't know about, or know but forgot to include in this post. We live in interesting times, and I'm looking forward to what is to come.
*this post won’t be going into the technical details of these concepts
Savings lottery for low-income earners
Roughly 7 months ago, I wrote a blogpost titled “The impact of community currencies in low-income communities”, in which, as the title suggests, I covered the real-world impact these paper-based, and crypto-based, community currencies have on low-income communities (I would recommend you read it if you haven’t).
For those who know my history, low-income communities hold a special place in my heart, having dedicated the better half of a decade to working among such communities.
Another idea I have been chewing on is that of savings or no-loss lotteries to benefit these marginalised communities.
Lotteries are big business worldwide. What may come as a surprise to many is the target-market for said lotteries - the poor and low-income groups. Americans spend billions of dollars per annum on lotteries. But I don’t live in the United States (not yet??), so I will cover stats of an area I know best.
The following were reported by our National Lottery Board in 2011.
In South Africa:
poorer people play the lottery more regularly than those with higher incomes
73% of lottery players earn less than R5,000 (~$341 at today's exchange rate) a month
of those, 33% earn R1,000 (~$68 at today's exchange rate) a month or less
71% sacrificed household necessities to play the lottery
These are shocking, yet somewhat expected, statistics.
I wanted to contrast the above stats with savings rates among the poor and low-income earners - but with some making R1,000 (~$68) a month to live off, there is literally nothing to save!
💔
These stats are further compounded by South Africa's 28.48% unemployment rate. But this post is not about these depressing stats. Beyond these, one other characteristic of lotteries is that players lose their money. There is a 1 in 13,983,816 chance of winning the jackpot (in South Africa, at least).
There is a new (or old?) concept of a savings lottery / lottery savings scheme / no-loss lottery. The idea is you save money, and those savings give you a chance of winning - it's like playing the lottery, except you don't lose your money after playing. And when/if you win, you get the winnings and all your savings are still there.
Such initiatives are great and encourage a savings culture among low-income earners. I'm unsure how good the uptake will be, but judging by how much is spent on lotteries, maybe with ample education and marketing these initiatives could pick up? Granted, perhaps the winnings won't be as high as traditional lotteries and won't make you an instant millionaire - but we gotta start somewhere, right?
There are various explorations of this idea worldwide - aimed at moving low-income earners from the habit of gambling (the lottery) to a form of saving.
There is even a blockchain-based no-loss lottery savings scheme - PoolTogether - which I personally love and use :-) . Anyone with an Ethereum wallet can save, in crypto or stable-coins/crypto-dollars, and stand a chance to win a few thousand dollars weekly. Your deposited crypto or stable-coins/crypto-dollars remain available to you at any point.
I don’t know if there are any such initiatives in South Africa (reach out if you know of any!), but this initiative would be one of many much needed services for low-income earners in the country. Maybe there is some research work on this that I’m not aware of (again, do reach out if there is).
Anyway, there are a few other thoughts/ideas I have been thinking about. I will put them “on paper” once I have digested them.
Software supply chain integrity
There is a deluge of supply chain hacks that could leave industry professionals dispirited - from the high-profile SolarWinds hack to other lesser-known and lesser-reported cases. In all honesty, there is a case for the dejection that could be felt by some in the industry; there is also a case for why attackers seem to be focusing more on this attack vector.
Enterprises do not exist in a vacuum. There is only so much vertical integration one can do, before having to rely on some third-party software.
Our third-party security due diligence also hasn’t seemed to help in addressing these supply chain attacks. One is almost assured SolarWinds completed countless “third-party assurance” forms/documents.
Robust Application Security (AppSec) programs are certainly one way to try to address this - but they're not the whole picture. "Shifting security left", as it's fondly referred to, typically consists of performing security scans/steps early in the development process, including static, dynamic and dependency scans.
These sorts of programs have no doubt prevented a few nasties from materializing. But it's doubtful they would have prevented the SolarWinds-type attack. This software assurance pipeline is great at detecting software vulnerabilities, but it's doubtful it could detect a backdoor (an intentional, often covert vulnerability).
Almost everyone in security is aware of, and hopefully comfortable with, checksums. These are used to verify the integrity of downloaded software.
That SHA-256 digest provides assurance that the executable you downloaded hasn't been tampered with in any way. This cryptographic assurance is important and ensures people download the software intended by the vendor, as opposed to some malware.
The problem is, these checksums only cryptographically verify what's downloaded/distributed - the end result. It's better than nothing, but more could be done.
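For reference, verifying a download against a vendor-published digest is only a few lines of code. The filename and digest below are placeholders; the point is simply that this check covers only the final artifact.

```python
import hashlib
import sys

def sha256_of(path: str) -> str:
    """Hash a file in chunks so large downloads don't need to fit in memory."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Usage: python verify.py <downloaded-file> <published-sha256>
if len(sys.argv) == 3:
    if sha256_of(sys.argv[1]).lower() == sys.argv[2].lower():
        print("checksum matches the vendor's published digest")
    else:
        print("checksum MISMATCH - do not run this binary")
```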
Enter in-toto, a framework that addresses exactly this: it defines a cryptographic layout of the steps in a software supply chain that are carried out in order to write, test, package and distribute software. It is essentially a checksum, but for the entire software supply chain, as opposed to only the end result.
A typical software supply chain looks like this:
Code > Test > Build > Package > Distribute
What we have come to discover is that attackers can, and have, compromised just about any point of that software supply chain. If the compromise is at the build phase (as with the SolarWinds incident), a checksum of the package will only attest to the integrity of the final version. As you can see, that assurance isn't complete - especially in today's threat-ridden world.
Now, some solutions do exist for the various components of the software supply chain.
The aforementioned AppSec pipelines are often included in the CI/CD component.
In-toto provides a layout of each step of the supply chain, including all the signing keys in the chain, as well as any other artifact (inputs & outputs) of every step, hash-chained and cryptographically signed. It’s like a recipe book that includes all ingredients needed to make a special dish.
It allows an end user to have assurance, via the cryptographically signed hash-chain, that the recipe was indeed prepared according to all the steps in the recipe book - from the shop to your table.
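To illustrate the idea only (this is not in-toto's real metadata format or API), the toy sketch below records, for each supply chain step, hashes of its inputs and outputs plus an authenticated tag, and links each record to the previous one so a consumer can verify the whole path rather than just the final artifact. Real in-toto uses per-functionary signing keys and signed link metadata; the HMAC tags here are a self-contained stand-in.

```python
import hashlib
import hmac
import json

# Assumed per-step secrets standing in for each functionary's signing key.
STEP_KEYS = {"code": b"dev-key", "build": b"ci-key", "package": b"release-key"}

def record_step(name: str, inputs: bytes, outputs: bytes, prev_tag: str) -> dict:
    """Record hashes of a step's inputs/outputs and chain it to the previous step."""
    payload = {
        "step": name,
        "inputs": hashlib.sha256(inputs).hexdigest(),
        "outputs": hashlib.sha256(outputs).hexdigest(),
        "prev": prev_tag,
    }
    blob = json.dumps(payload, sort_keys=True).encode()
    payload["tag"] = hmac.new(STEP_KEYS[name], blob, hashlib.sha256).hexdigest()
    return payload

chain, prev = [], ""
for step, (inp, out) in {
    "code": (b"", b"source tree"),
    "build": (b"source tree", b"compiled binary"),
    "package": (b"compiled binary", b"installer"),
}.items():
    entry = record_step(step, inp, out, prev)
    prev = entry["tag"]
    chain.append(entry)

# A verifier replays the same hashes and tags end to end; any tampered step breaks the chain.
print(json.dumps(chain, indent=2))
```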
I think this is a great step, which many of us should incorporate as our security programs mature. Supply chain attacks will be the death of us all for some time to come, but using frameworks like in-toto provides an additional layer of integrity in our software supply chain.
For critical software deployed in enterprises, could the industry start demanding this sort of software supply chain integrity? I would hope so.
EULAs, contracts, etc. would have to change though, and that may not be easy to do. Adding this sort of integrity in the software supply chain will certainly not stop all attacks (nothing can), but it does add an additional assurance layer. I think it’s a great, much needed step :-)
The in-toto project has a website that helps you create a basic layout for a software project, specifying who does what and how everything fits together, so that clients can be sure the software was produced exactly how you wanted it to be; check it out here!
Update (28-April): it appears there is a framework proposal, akin to OWASP ASVS, but for supply chain - Supply-chain Levels for Software Artifacts. Looks interesting and I intend to either contribute or keep my eyes on it. Good times!
How Eth2.0 mitigates specific PoS (Proof of Stake) attacks
In December 2020, the ETH2.0 Beacon Chain launched. This Phase 0 launch forms part of a multi-year process that will see ETH1.x transform from a PoW (Proof of Work) to a PoS (Proof of Stake) blockchain.
This is an exciting period in Ethland and is the product of years of research into optimal solutions to ensure Ethereum remains secure and decentralized despite a change of consensus mechanism. It's been a long-held belief that PoW has the strongest security guarantees, unlike, say, PoS.
I have previously covered some forms of attacks against Proof of Stake chains and so, with Ethereum moving to PoS, let’s see how it intends to defend against these PoS attacks or if any are applicable at all!
Nothing at stake attack - this attack relies on the fact that it is "cheap" (costs almost nothing) to mine on multiple forks of the same PoS (Proof of Stake) chain, and building on multiple forks couldn't be detected or discouraged.
Long range attacks - this attack occurs when an adversary creates a branch/fork on the blockchain starting from the Genesis block (or thousands of blocks in the past) and overtakes the main chain, thus rewriting history.
Eth2.0's security model makes attacks extremely expensive by putting economic value at risk of loss, i.e. security relies on penalties, not just rewards.
Eth2.0:
solves 1) (nothing at stake) by making use of a punitive proof of stake algorithm in which validators are penalized (slashed) if they sign blocks on competing forks. This idea, called Slasher, was introduced by Vitalik in 2014. (A toy double-vote check is sketched below.)
We have indeed already seen a validator get slashed ~0.25 ETH.
solves 2) (long range attacks) by accepting "weak subjectivity", the condition at the root of long range attacks. Weak subjectivity relates to new nodes, and offline nodes that come back online after a significant amount of time: such nodes cannot immediately distinguish which of the branches they received is the main chain. With Proof of Work, it's easy to determine the main chain, as it is the one with the most proof of work, whereas in Proof of Stake, since there is no 'work' done, it's easier for such nodes to be deceived, at least for a time.
The Eth2.0 Beacon Chain uses weak subjectivity checkpoints, which are similar in concept to the genesis block, in that each is a block that is agreed upon by the entire network as part of the "real" chain.
The Eth2.0 research work went a step further to determine a weak subjectivity period - which is defined as the number of recent epochs within which there must be a weak subjectivity checkpoint so that an attacker who takes control of the validator set at the beginning of the period is slashed at least a threshold amount in case a conflicting finalized checkpoint is produced.
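As a concrete picture of the slashing condition from point 1), here is a toy double-vote check. It is simplified and assumed: Eth2.0's real slashing conditions also cover surround votes and are enforced by submitting the conflicting signed attestations as on-chain evidence.

```python
# Map (validator, target_epoch) -> the block root that validator already voted for.
seen_votes = {}

def is_slashable_vote(validator: str, target_epoch: int, block_root: str) -> bool:
    """Return True if this vote conflicts with an earlier vote for the same target epoch."""
    key = (validator, target_epoch)
    if key in seen_votes and seen_votes[key] != block_root:
        return True           # two different blocks signed for one epoch: a double vote
    seen_votes.setdefault(key, block_root)
    return False

print(is_slashable_vote("validator_1", 100, "0xabc"))  # False - first vote for this epoch
print(is_slashable_vote("validator_1", 100, "0xdef"))  # True  - conflicting vote, slashable
```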
The Ethereum re-engineering from PoW to PoS has been described as "trying to change an airplane engine mid-flight". It could be disastrous if it all goes wrong, but so far, all the research work and planning that went into this means things are moving along nicely.
I will be eagerly watching, and researching, this progress over the next 12-18 months!
Insurance taking center stage in Cyber - improving and maturing it?
Before tweeting the above, I spent some days trying to succinctly put into words what I was observing and what I thought would take place in Information Security in the near future.
There is little doubt in my mind that ransomware is one of the biggest threats facing organizations and governments these days - the initial vectors of compromise vary from vulnerability exploitation to phishing (maldocs or cred stealing).
Organization Cyber Insurance
What I then started thinking about was the best way to deal with this threat. Yes, we can update our systems, use 2FA and be cautious of unsolicited emails, etc., but I kept coming back to insurance.
But talks of cyber insurance are nothing new.
In 2003, Paul Kurtz, former Homeland Security Council and National Security Council Director, commented:
“The Insurance industry has a pivotal role to play in protecting our national infrastructure, particularly by developing cyber insurance policies”
I'm also of the view that, thanks to cyber insurers' minimum security requirements for coverage, cyber insurance will do more for cyber security maturity and resilience than some of the things we have concerned ourselves with over the years. It may feel or sound like a cop-out to some in infosec, but it is what it is.
Some of the rough ideas in my mind are:
lower security maturity could mean higher insurance premiums, especially if the organization has had breaches in the past
higher security maturity could mean lower insurance premiums (maybe)
Just as one type of control isn't enough, I'm of the view that cyber insurance will become the norm.
Right now, perhaps it’s talked about in hushed voices in work corridors and behind closed doors in board rooms. But I fully expect it to be an open, standard thing in infosec.
Perhaps even with third-party supplier management (“ hey, we wanna do business with you, but what are your infosec policies, processes and governance? And do you have cyber cover?”) Or maybe not :-D
cyber insurance is, and should be, part of every organization's arsenal for effective risk management
Cyber insurance is the fastest-growing line of business in the insurance industry; cyber insurance premiums have grown to an estimated USD 2 billion in North America and USD 3 billion globally, and there are no signs of it slowing down.
We already know of a few public instances where an organization had to pay a ransom after systems were encrypted. In a few such cases, the cyber insurer actually paid:
University of Utah pays $457,000 to ransomware gang - cyber insurance pays part of the ransom
Garmin allegedly paid the ransom, although it is still unclear if their cyber insurers coughed up
Over and beyond organizational cyber insurance, one area of interest has been personal cyber insurance.
Personal Cyber Insurance
More on the home front, with more and more home appliances connected to the internet, could we see a rise in personal/home cyber insurance?
We often joke about one's kettle getting ransomware, that spreading within your home network, and preventing you from opening the fridge - but such situations, as laughable as they may be, may not be so far-fetched.
A survey conducted by Swiss RE provides some interesting results:
56% of people would buy personal cyber insurance
63% preferred personal cyber insurance bundled with other services
37% preferred stand-alone cyber insurance
Personal cyber insurance may be yet to be fully developed (or maybe it already is?). I would be very interested in seeing how it pans out.
Maybe companies will provide “personal cyber insurance” as part of employment perks ;-p.
UPDATE: I came across an awesome, data-filled report on cyber insurance claims. Worth a read.
What got us here, won’t get us there - Passwords
There is no doubt passwords suck. Almost every week there are articles about "millions and billions of passwords being sold by hackers".
Continued use and mismanagement of passwords enable and finance cyber criminal marketplaces.
What makes passwords suck even more is the outdated advice most security professionals still give (in 2020!):
"make your password complex (1 uppercase, 1 lowercase, 1 special character, etc.)"
change your passwords every 60 - 90 days to stay secure
Security professionals still give this advice despite recommended password best practices from NIST (National Institute of Standards and Technology). Yes, NIST recommends that passwords NOT be changed often unless there is evidence of compromise; it also advises discontinuing the use of complexity rules.
Passwords clearly need to go. Hopefully more and more organisations already have, or are at least beginning to draft, a road map for doing away with outdated password security practices and enabling MFA - or, even better, a road map for phasing out passwords where possible.
So, what are some of the exciting passwordless innovations taking place?
Microsoft
Microsoft have awesome documentation and demos of going passwordless, focusing primarily on:
Windows Hello for Business
Microsoft Authenticator App
FIDO2 Security Keys
Their passwordless authentication options for Azure Active Directory documentation is awesome and worth checking out.
Auth0
Auth0 is also doing wonders in this push, offering the following passwordless factors:
Email
Magic Link
SMS
Auth0’s documentation has awesome explanations and walk throughs.
Magic
Magic is another player in the passwordless arena, albeit predominantly focused on the blockchain industry. They do away with passwords and stick with email magic links (akin to Auth0).
This reduces the friction even upon sign-ups.
How sign-ups are currently done:
How magic does sign-ups:
I found Magic's documentation to also be awesome and worth reading.
I’m certain there are many more companies working on this, but I only listed those I use often.
We live in interesting and exciting times, and I often wish I was a developer when I think of all the great things that can, and have, been done in recent years.
Less Friction. More Security. No passwords.
That’s the future I look forward to.
The impact of community currencies in low-income communities
In 2012, I was part of a Youth Innovation Showcase where I presented an idea for a platform called Commune. One of the platform’s three focus areas was stimulating local economies via cash mobs.
Needless to say, that idea and platform didn’t have an opportunity to fully materialize as I had hoped. I was bitten by the Cyber Security bug and developed new interests.
Since then, one idea has unexpectedly captured my attention - community currencies. Until then, I had only known of the 180 world currencies recognized by the United Nations.
But reading the works of Bernard Lietaer and others like Thomas H. Greco spoke volumes on this. I particularly liked Lietaer’s focus on local economic development with these complementary community currencies.
These “slum economies” or community currencies are complementary mediums of exchange alongside a national currency and serve a particular purpose: stimulating local economies during business downturns and helping provide for basic needs when the national currency is in short supply.
In his 2001 paper, The Economics of Community Currencies: A Theoretical Perspective, Schraven states: “when the national money fails to facilitate all potential exchanges of a sub-set of the economy that has strong economic interconnections, a complementary currency can alleviate this problem”.
By then, I was already aware of Bitcoin, but it wasn’t until much later that I began to think of Bitcoin as a form of community currency (more on this in a follow-up article).
In The Future of Money, Lietaer provides a few examples of such complementary community currencies:
Time Dollars - invented by a prominent Washington lawyer and applied in several hundred communities in the US
Ithaca HOURS - a paper currency launched by a community activist in the small university town of Ithaca, New York. Ithaca is a relatively low-income community of about 27,000 inhabitants
Tlaloc - a popular Mexican neighborhood currency
Bia Kud Chum - the first South-East Asian community currency
WIR - an independent complementary currency system in Switzerland that serves businesses in hospitality, construction, manufacturing, retail and professional services.
Closer to home, there is one few have heard of: Bangla-Pesa - a colorful, paper-based complementary community currency used in the Bangladesh slum of Mombasa, Kenya. It is a voucher that traders and service providers use as a medium of exchange.
Benefits of community currencies
There exists ample research on the impact of community currencies and how they contribute to sustainable livelihoods in poor and informal settlements. It’s worth noting, however, that some community currencies, like Ithaca HOURS, lack clear evidence of economic advantage, while others clearly do show one.
Below are some examples of the economic benefits of community currencies like Bangla-Pesa:
Daily purchases in the Eco-Pesa community currency allow members to save money in their national currency that they would otherwise have spent (these community currencies function as mediums of exchange and are not meant to be stores of value)
83% of business participants reportedly saw an increase in total sales, attributed to Bangla-Pesa
Members have a significantly higher food consumption than non-members, as well as a generally higher food budget
The Bangla-Pesa community currency accounted for a 22% increase in monthly income
The Liquidity Problem & Blockchain Solution
There are, however, a few problems with community currencies:
Value generated within the community stays within the community. It cannot be spent within another community
Lack of trust
Scaling and design constraints (typically can’t scale beyond 100 - 200 businesses in the network)
These, among other issues, have been addressed to a degree by incorporating Distributed Ledger Technologies.
A recent paper on this issue stated that of the over 4,000 complementary currencies that have sprung up across 50 countries, most fail to create sustainable monetary alternatives due to low liquidity.
Liquidity is defined as the probability that an asset can be converted into an expected amount of value within an expected amount of time.
Blockchains can provide transparency, auditability and liquidity reserves with built-in convertibility between community currencies, promoting greater usage and collaboration between communities. Users are empowered to exchange community currencies for goods and services outside of the originating community, as well as to convert between them and national currencies.
A case in point is the Sarafu Network. Sarafu is a digital community currency that can be used much like M-Pesa - the renowned Safaricom product.
Documentation of the Sarafu Network can be found on their public GitLab, where they cover technical details such as their smart contracts, blockchain choice (POA, xDAI), wallets, fiat on-ramps and off-ramps, etc.
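To give a flavour of what a token-based community currency looks like at the code level, here is a minimal sketch of reading a balance and paying a local vendor with ethers.js. The RPC endpoint, private key, token address and ABI are illustrative stand-ins, not the Sarafu Network's actual deployment details.

```typescript
// Sketch: interacting with a token-based community currency (ethers.js v5).
// RPC URL, private key, token address and ABI are illustrative placeholders,
// not the Sarafu Network's real contracts.
import { ethers } from "ethers";

const RPC_URL = "https://rpc.example.org";          // e.g. an xDai/POA endpoint
const TOKEN_ADDRESS = "0x0000000000000000000000000000000000000000";
const ERC20_ABI = [
  "function balanceOf(address owner) view returns (uint256)",
  "function transfer(address to, uint256 amount) returns (bool)",
  "function decimals() view returns (uint8)",
];

async function payVendor(privateKey: string, vendor: string, amount: string) {
  const provider = new ethers.providers.JsonRpcProvider(RPC_URL);
  const wallet = new ethers.Wallet(privateKey, provider);
  const token = new ethers.Contract(TOKEN_ADDRESS, ERC20_ABI, wallet);

  const decimals: number = await token.decimals();
  console.log("Balance:", (await token.balanceOf(wallet.address)).toString());

  // Transfer e.g. "10" units of the community currency to a local vendor.
  const tx = await token.transfer(vendor, ethers.utils.parseUnits(amount, decimals));
  await tx.wait(); // every transfer is publicly auditable on-chain
}
```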
Blockchain-based Community currencies
Grassroots Economics, an organization that spearheaded these community currencies in Africa, found that within a 6-month pilot of the Sarafu network:
4,065 Kenyans representing families living below the poverty line traded about $30,000 with each other
Linking 9 community currencies in circulation among 1,136 businesses, clinics and schools
Payments made with these community currencies included:
2,567 daily farming wages
54,928 servings of vegetables
5,361 kilos of flour
2,506 rides on local transport
843 school tuition payments
484,404 liters of water
59 visits to the doctor
Mind-blown!
These aren’t just numbers; they represent healthcare, education, and sustenance for hundreds of families living below the poverty line, without depending on the government-issued national currency.
Conclusion
I’m slightly saddened that during my work among marginalized communities years ago, I wasn’t aware of community currencies and focused on using national currencies. Maybe I could have done more. But I’m also overjoyed at the results I have seen over the years of tracking the impact of these community currencies.
Given the above data about the effects of community currencies, there is a case to be made for experimenting and deploying these in informal settlements around the globe. A few initiatives have already embarked on this journey and I hope we will see many more in the coming years.
I’m rooting for the success of all these projects!!
Static analysis lessons from big tech co’s
In 2018 and 2019, Google and Facebook released the lessons they learnt from scaling static code analysis in their respective companies. Their articles were titled Lessons from Building Static Analysis Tools at Google and Scaling Static Analyses at Facebook, respectively.
These companies build software that is used by hundreds of millions of people daily, with code bases that run to hundreds of thousands, if not millions, of lines of code. The tools mentioned and used in these respective companies may not yet be open sourced, but the lessons shared can be applied at most companies building their own software.
Given my current role working with multiple engineering teams and multiple products, a few points from these Google and Facebook articles resurface in my mind every now and then. These aren’t ground-breaking ideas.
Below are a few points that have stuck with me, even months after I last read those articles:
Static analysis tools can be powerful
Static code scanning is great but can be annoying when it comes to a high number of false positives. High false-positive rates are often justification for inaction by engineering teams - and rightly so.
Work on reducing false positives before findings are surfaced to engineering teams. Well-configured and well-tested static analysis tools can prove to be powerful enablers of security with engineering teams.
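One concrete way of doing this is a small triage step that filters a scanner's SARIF output down to high-confidence findings before anything reaches an engineer. The severity threshold, rule names and file name below are made up for illustration.

```typescript
// Sketch: filter a static-analysis SARIF report before surfacing findings.
// The rule allowlist, severity threshold and file name are illustrative choices.
import { readFileSync } from "fs";

interface SarifResult {
  ruleId?: string;
  level?: string; // "error" | "warning" | "note" | "none"
  message: { text?: string };
}

interface SarifLog {
  runs: { results?: SarifResult[] }[];
}

// Only surface rules the team has agreed have a low false-positive rate.
const TRUSTED_RULES = new Set(["js/sql-injection", "js/hardcoded-credentials"]);

function highSignalFindings(sarifPath: string): SarifResult[] {
  const log: SarifLog = JSON.parse(readFileSync(sarifPath, "utf8"));
  return log.runs
    .flatMap((run) => run.results ?? [])
    .filter((r) => r.level === "error")                // drop low-severity noise
    .filter((r) => r.ruleId && TRUSTED_RULES.has(r.ruleId));
}

for (const finding of highSignalFindings("scan.sarif")) {
  console.log(`${finding.ruleId}: ${finding.message.text ?? ""}`);
}
```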
Focus on developer teams and their tool/workflow integrations
This means that static scan dashboards and outputs must feed into engineering workflows instead of requiring engineers to go out of their way to log in to your separate bug dashboard/system.
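As a rough sketch of what meeting engineers in their workflow can look like, here is a minimal example that posts a finding as a comment on the pull request it came from, using GitHub's REST API. The repository name, token handling, PR number and finding text are all placeholders.

```typescript
// Sketch: surface a static-analysis finding directly on the pull request
// instead of in a separate security dashboard. Repo, token, PR number and
// finding text are placeholders.

const GITHUB_TOKEN = process.env.GITHUB_TOKEN ?? "";
const REPO = "example-org/example-service";

async function commentOnPullRequest(prNumber: number, finding: string) {
  const url = `https://api.github.com/repos/${REPO}/issues/${prNumber}/comments`;
  const response = await fetch(url, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${GITHUB_TOKEN}`,
      Accept: "application/vnd.github+json",
      "Content-Type": "application/json",
    },
    body: JSON.stringify({ body: `Static analysis finding:\n\n${finding}` }),
  });
  if (!response.ok) {
    throw new Error(`GitHub API error: ${response.status}`);
  }
}

commentOnPullRequest(42, "Example finding: possible SQL injection in orders.ts")
  .catch(console.error);
```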
Know & focus on bugs that matter
Not all bugs matter. Enable and empower engineers to address the bugs that matter by using data from production to determine which ones those are (e.g. exploitable bugs from pentests and bug bounties). Not every bug is important enough to be addressed as a matter of urgency.
Mental effort of context switch
If a developer is working on one problem and is confronted with a report on a separate problem, they must swap out the mental context of the first problem and swap in the second - and this can be time-consuming and disruptive. Try to minimize this.
Report often, report early
Report bugs to developer teams early to get the optimum fix rate. In one of the articles, the engineering teams, responding to a query from the security team, reported that issues found and flagged at compile time were deemed more important than issues found in already checked-in code.
The lessons learnt from big tech companies with more mature processes, tools and experience make for good reading for almost everyone in security, as well as for developers and engineering teams themselves.
Archiving parts of my site on the Blockchain
The internet never forgets (until it has to)
I love blockchain technology in general. I’m an investor in, have written about, and am a user of, various blockchain platforms. I have also contributed by disclosing bugs in various blockchains.
I used a few document-storage blockchain projects like Storj, IPFS, etc. a little under 4 years ago.
Now that I have my own site, I decided to look at these protocols again. I recalled a protocol that was on my radar called Arweave, a data storage blockchain protocol. I found it to be the easiest to use for what I wanted. I had read their whitepaper ages ago, but hadn’t really used their tech.
Today was the perfect opportunity to use it.
Arweave has a concept they call “permaweb”, built on top of the Arweave network.
The permaweb is a collection of interlinked documents and applications that get stored permanently on their protocol (so you may want to be careful what you put there).
I decided to put some sections of my site on the permaweb. Ironically, I opted to archive all the articles I wrote about blockchain technology - on the blockchain (technically “blockweave”)!
For example, this paper - Future tourism trends: Utilizing non-fungible tokens to aid wildlife conservation - no longer points to the AJHTL Journal website. It now points to, and is archived on, the Arweave permaweb.
Archived paper: https://sqwgu2q52ul3lozh67aqok6l4sdqj4r455l2esuiktdt5xkfkb5q.arweave.net/lCxqah3VF7W7J_fBByvL5IcE8jzvV6JKiFTHPt1FUHs
View raw transaction: https://arweave.net/tx/lCxqah3VF7W7J_fBByvL5IcE8jzvV6JKiFTHPt1FUHs
View in block explorer: https://viewblock.io/arweave/tx/lCxqah3VF7W7J_fBByvL5IcE8jzvV6JKiFTHPt1FUHs
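For the curious, uploading a document to the permaweb is roughly this simple with the arweave-js client; the wallet file path, document path and tags below are illustrative, not the exact ones I used.

```typescript
// Sketch: archiving a document on the Arweave permaweb with arweave-js.
// The wallet path, document path and tags are illustrative.
import Arweave from "arweave";
import { readFileSync } from "fs";

const arweave = Arweave.init({
  host: "arweave.net",
  port: 443,
  protocol: "https",
});

async function archiveDocument(path: string, walletPath: string) {
  const wallet = JSON.parse(readFileSync(walletPath, "utf8"));
  const data = readFileSync(path);

  // Create, tag, sign and post the data transaction.
  const tx = await arweave.createTransaction({ data }, wallet);
  tx.addTag("Content-Type", "application/pdf");
  tx.addTag("App-Name", "personal-archive"); // illustrative tag

  await arweave.transactions.sign(tx, wallet);
  await arweave.transactions.post(tx);

  // Once mined, the document is permanently retrievable at this URL.
  console.log(`https://arweave.net/${tx.id}`);
}

archiveDocument("./paper.pdf", "./arweave-keyfile.json").catch(console.error);
```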
Why did I do it?
For fun. And because I like keeping my finger on the pulse of what’s happening in this industry.
In due time, I will migrate this entire site to be hosted, in parallel, on one of these blockchain protocols. Again, just for fun :-)
UPDATE:
I have pinned my 2017 article from Pentest Magazine on IPFS. The article is a tutorial on reverse engineering, exploit writing and micropatching - A new initiative of micropatching - 0patch!
This article on the impact of community currencies in low-income communities is also archived on the Arweave permaweb
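The IPFS side is similar in spirit; below is a minimal sketch using the ipfs-http-client package against a node you control. The node URL and file name are placeholders, and the client API may vary between package versions.

```typescript
// Sketch: adding and pinning an article on IPFS via ipfs-http-client.
// The node URL and file name are placeholders.
import { create } from "ipfs-http-client";
import { readFileSync } from "fs";

const ipfs = create({ url: "http://127.0.0.1:5001/api/v0" });

async function pinArticle(path: string) {
  const { cid } = await ipfs.add(readFileSync(path)); // add the content
  await ipfs.pin.add(cid);                            // keep it pinned on this node
  console.log(`ipfs://${cid.toString()}`);
}

pinArticle("./micropatching-article.pdf").catch(console.error);
```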
Welcome to my new home!
We can all agree on one thing - 2020 has been a weird year. We are all embracing the “new normal” which was thrust upon us.
This “new normal” allowed me plenty of time to contemplate the changes and challenges I needed to set for myself - chief among which was setting up a personal site to house my thoughts, professional achievements and occasional cybersecurity-related posts (being de-platformed from LinkedIn, and losing all the articles and posts I had put up on that platform over the years, left a sour taste in my mouth).
I read a lot. I often try to share summaries of what I read with friends and try (& fail) to post long-form pieces on Twitter. I think this personal site will be a better platform for that. Though I work in cybersecurity, my readings and interests vary, and so will my occasional posts and book reviews.
I hope you will visit, if only occasionally, to get a glimpse of what I get up to.
Cheers.
TK a.k.a Peter a.k.a Dmitri Kaslov