(Verifying) Zero-knowledge is the Real Endgame

Introduction

Zero-knowledge proofs, first introduced by Goldwasser, Micali, and Rackoff in the mid-1980s, have revolutionized cryptography and found profound applications in blockchain.

Zero-knowledge proofs provide a way to verify the truth of a statement without revealing the underlying data. This concept is particularly useful in blockchain technology, especially for zero-knowledge rollups. Zero-knowledge rollups bundle many transactions together and generate a single proof that anyone can verify. This process significantly reduces the amount of data that needs to be processed and stored on the blockchain, improving efficiency.

However, the security of ZKPs depends heavily on both the prover (who generates the proof) and the verifier (who checks the proof). If there are bugs or vulnerabilities in the prover or verifier, the whole system can be compromised.

This research emphasizes the importance of verifying the verifier to ensure the integrity of zero-knowledge systems, and explains why Lumoz, with its AVS computational layer and zkRaaS offering, has taken a significant step toward addressing this challenge with the launch of zkProver and zkVerifier, built on EigenLayer.


What are ZKPs?

A ZKP protocol allows a prover to convince a verifier of the validity of a statement without revealing any information beyond the truth of the statement itself. This property of "zero-knowledge" is achieved through intricate mathematical constructions that enable the prover to generate a proof that the verifier can verify without gaining any additional knowledge about the prover's secret information.
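To make the prover/verifier roles concrete, here is a toy interactive proof of knowledge in the Schnorr style, sketched in Python. The group parameters are deliberately tiny and insecure, purely for illustration; real systems use large elliptic-curve groups.

```python
import random

# Toy Schnorr proof of knowledge of a discrete log (illustrative only;
# real deployments use large elliptic-curve groups, not these tiny numbers).
p, q, g = 23, 11, 2   # g generates the subgroup of prime order q in Z_p*

def keygen(x):
    """Secret x, public key y = g^x mod p."""
    return pow(g, x, p)

def prove_commit():
    r = random.randrange(1, q)
    return r, pow(g, r, p)          # commitment t = g^r

def prove_respond(r, c, x):
    return (r + c * x) % q          # response s = r + c*x mod q

def verify(y, t, c, s):
    # Accept iff g^s == t * y^c (mod p); reveals nothing about x itself.
    return pow(g, s, p) == (t * pow(y, c, p)) % p

x = 7                      # prover's secret
y = keygen(x)              # public statement: "I know x with g^x = y"
r, t = prove_commit()      # 1. prover commits
c = random.randrange(1, q) # 2. verifier sends a random challenge
s = prove_respond(r, c, x) # 3. prover responds
assert verify(y, t, c, s)  # honest proof accepted (completeness)
```

The verifier only ever sees (t, c, s), which can be simulated without knowing x; that is the intuition behind the zero-knowledge property.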

ZKPs utilize cryptographic algorithms such as zk-SNARKs (Succinct Non-interactive Arguments of Knowledge) and zk-STARKs (Scalable Transparent Arguments of Knowledge) to achieve these properties.

ZKPs enable the verification of assertions without revealing the underlying data, enhancing privacy and enabling more efficient blockchain operations. As the adoption of ZKPs grows, the importance of reliable, secure, and efficient ZKP verification mechanisms becomes increasingly critical as any vulnerability or flaw in the verification mechanism could undermine the trust and integrity of the entire network.

The above diagram explains how a zero-knowledge proof can be generated and verified, ensuring the validity of the statement without revealing any additional information.

What Is a Verifier and Why Do We Need It in the First Place?

As mentioned earlier, in a ZKP system, the prover generates proof that a certain statement is true without revealing any underlying information, and the verifier checks the validity of this proof.

Just as a compromised security checkpoint can jeopardize the safety of an entire airport, a flawed verifier can endanger the security of the entire blockchain network.

The integrity and security of ZKP systems hinge critically on the correctness of the verifier. A verifier that operates correctly ensures that only valid proofs are accepted, thereby maintaining the trust and reliability of the system. Conversely, a verifier with bugs or vulnerabilities can lead to catastrophic security failures, allowing invalid proofs to be accepted and potentially compromising the entire blockchain network.

With the growing trend around zero-knowledge proofs and their adoption, we must emphasize the necessity of verifying the verifier to ensure the long-term security and reliability of ZKP systems.

We can argue here that while highly optimized ZK provers can provide an effective alternative to traditional multi-execution models in blockchains, the correctness of the verifier is paramount. Implementation bugs in the verifier can easily compromise the security of the overall system, making formal verification of the verifier an indispensable aspect of ZKP deployment.

ZK Is Where Math Meets Practicality

While zero-knowledge proofs are based on mathematical principles, their practical implementation involves complex code, and bugs in that code can undermine the security of the system. There are three main types of bugs:

  1. Buggy Circuits: In zero-knowledge proof systems, the relations that need to be proven are specified using arithmetic circuits. These circuits can have bugs that affect the correctness of the proofs. For example, if the circuit is not correctly implemented, it might not properly represent the relation it is supposed to prove.

  2. Buggy or Malicious Provers: The prover is responsible for generating the proof. Bugs in the prover can lead to incorrect proofs being generated. Additionally, if an attacker can modify the prover's code, they might be able to create forged proofs that falsely convince the verifier that an incorrect statement is true. An example of such a bug is the "Frozen Heart" vulnerability, which stems from improper implementations of the Fiat-Shamir transformation—a method used to convert an interactive proof into a non-interactive one by generating challenges from a hash of the protocol transcript.

  3. Buggy Verifiers: The verifier checks the validity of the proof. Bugs in the verifier can be even more problematic because they can allow forged proofs to be accepted. If the verifier mistakenly accepts an incorrect proof, it compromises the security of the entire system.
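The "Frozen Heart" pattern from item 2 can be sketched concretely. The toy below uses a Schnorr-style check: a weak verifier derives the Fiat-Shamir challenge without hashing the statement, and an attacker with no secret key forges an accepting proof by solving for the statement after fixing the challenge; binding the full transcript stops the same forgery. The hash, group size, and encodings are simplified placeholders, not any production protocol.

```python
import hashlib
import random

# Toy group: g generates the subgroup of prime order q in Z_p* (NOT secure sizes).
p, q, g = 23, 11, 2

def h(*vals):
    data = b"".join(int(v).to_bytes(8, "big") for v in vals)
    # Map into 1..q-1 so the challenge is always invertible mod q (toy only).
    return int.from_bytes(hashlib.sha256(data).digest(), "big") % (q - 1) + 1

def weak_challenge(y, t):
    return h(t)          # BUG: the statement y is left out of the transcript hash

def strong_challenge(y, t):
    return h(y, t)       # fix: hash the full transcript, statement included

def verify(y, t, s, challenge):
    c = challenge(y, t)
    return pow(g, s, p) == (t * pow(y, c, p)) % p

# Forgery against the weak verifier: fix t (hence c) first, then solve for a
# statement y_forged that satisfies the check. No secret key is involved.
while True:
    u = random.randrange(1, q)
    t = pow(g, u, p)
    c = weak_challenge(None, t)          # c does not depend on the statement
    s = random.randrange(1, q)
    if s == u:
        continue
    y_forged = pow(g, (s - u) * pow(c, -1, q) % q, p)
    # In this tiny group we skip the ~1-in-10 accidental collision with the
    # strong challenge; with 256-bit challenges a collision is negligible.
    if strong_challenge(y_forged, t) != c:
        break

assert verify(y_forged, t, s, weak_challenge)        # forged proof accepted
assert not verify(y_forged, t, s, strong_challenge)  # binding y stops the forgery
```

The takeaway: every value the verifier relies on, including the statement being proven, must be absorbed into the Fiat-Shamir hash.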

Hence, the verifier’s correctness is fundamental to the core properties of ZKP systems:

  1. Completeness: This ensures that if the prover and verifier follow the protocol correctly, any valid proof will be accepted by the verifier.

  2. Soundness: This guarantees that if a statement is false, no cheating prover can convince the verifier to accept an invalid proof.

  3. Zero-Knowledge: This ensures that the verifier gains no knowledge beyond the validity of the statement being proven.

Verifying the verifier involves a rigorous process of formal verification, which uses mathematical methods to prove that the verifier’s implementation adheres strictly to its intended specifications and security properties. This process is essential because the verifier is the ultimate arbiter of proof validity in a ZKP system. Any flaw or vulnerability in the verifier could be exploited to accept invalid or fraudulent proofs, thereby undermining the security and trustworthiness of the entire system.
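Full formal verification is heavyweight, and a lighter complement is randomized property testing of the guarantees a verifier must uphold. A minimal sketch against the toy Schnorr check: completeness on honest transcripts, and rejection of tampered responses, across many random instances.

```python
import random

# Property-style tests for a toy Schnorr verifier. This is a lightweight
# complement to formal verification, not a replacement for it.
p, q, g = 23, 11, 2  # toy parameters; illustrative only

def verify(y, t, c, s):
    return pow(g, s, p) == (t * pow(y, c, p)) % p

for _ in range(1000):
    x = random.randrange(1, q)          # fresh secret each round
    y = pow(g, x, p)
    r = random.randrange(1, q)
    t = pow(g, r, p)                    # honest commitment
    c = random.randrange(1, q)          # verifier's challenge
    s = (r + c * x) % q                 # honest response
    assert verify(y, t, c, s)                  # completeness: always accepted
    assert not verify(y, t, c, (s + 1) % q)    # tampered response: always rejected
```

A tampered response shifts g^s by a factor of g, so it can never satisfy the check; a property tester would catch a verifier implementation that accepts it anyway.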

Some Real-Life Examples

1. Critical Vulnerability in Linea Code Audit: Incorrect Randomness Computation Allows Proof Forgery

During a November 2023 audit of the Linea blockchain's verifier contracts (PlonkVerifierFull.sol and PlonkVerifierFullLarge.sol), a critical vulnerability was discovered. Linea, a ZK-rollup on Ethereum, relies on these verifiers to validate state transitions using ZKPs.

The vulnerability was found in how the verifier generated a random value u. This value is essential for validating proofs. However, the flawed implementation did not properly randomize u and failed to incorporate certain proof components ([Wζ]1 and [Wζω]1). This allowed attackers to predict and manipulate u, enabling them to create fake proofs that would be falsely verified as valid.

How the Attack Works

  1. Extract Values: The attacker extracts certain values from a valid proof.

  2. Manipulate Public Input: They set the public input to a chosen value and compute a modified value to pass checks.

  3. Alter Proof Components: The attacker adjusts proof components ([Wζ]1 and [Wζω]1) using the predictable u.

  4. Submit Fake Proof: The manipulated proof passes verification due to the flawed random value generation.

This vulnerability could potentially allow attackers to alter the blockchain state and steal assets.

The issue was resolved by updating the code to generate the random value u correctly, ensuring it is derived securely from the entire proof transcript, making it unpredictable and preventing forgery.
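The fix pattern can be sketched as follows: derive u by hashing the entire transcript, including the opening commitments [Wζ]1 and [Wζω]1. The serialization and hash below (SHA-256 standing in for the actual Fiat-Shamir hash, byte strings standing in for curve-point encodings) are simplified assumptions, not Linea's code.

```python
import hashlib

# Sketch of the fix: the batching randomness u must be derived from the
# *entire* proof transcript, including the opening-proof commitments
# [Wζ]1 and [Wζω]1 that the flawed version left out.
def derive_u(transcript_so_far: bytes, w_zeta: bytes, w_zeta_omega: bytes) -> int:
    h = hashlib.sha256()            # stand-in for the real Fiat-Shamir hash
    h.update(transcript_so_far)     # everything the verifier has absorbed so far
    h.update(w_zeta)                # [Wζ]1  -- omitted in the flawed version
    h.update(w_zeta_omega)          # [Wζω]1 -- omitted in the flawed version
    return int.from_bytes(h.digest(), "big")

u1 = derive_u(b"transcript", b"W_zeta_A", b"W_zeta_omega")
u2 = derive_u(b"transcript", b"W_zeta_B", b"W_zeta_omega")
assert u1 != u2   # changing any proof component changes u: no longer predictable
```

Because u now depends on every proof component, an attacker can no longer fix u first and then solve for the remaining components.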

This audit highlighted a critical vulnerability in the Linea verifier's random value generation, classifying it under "Buggy Verifiers." The fix involved ensuring proper random value generation to maintain the security and integrity of the ZKPs used by the Linea blockchain. This example underscores the necessity of rigorous verification processes and formal verification methods to prevent such vulnerabilities and enhance the security of zero-knowledge proof systems.

2. Critical Vulnerability in zkSync Code Audit: Incorrect Deposit Cap Management

An audit of the Matter Labs zkSync 2.0 contracts repository, specifically at the 3f345ce commit, identified a critical vulnerability. The audit covered various contracts, including “AddressAliasHelper.sol” and “L2ContractHelper.sol”. zkSync Era is a permissionless ZK-rollup enabling interaction with Turing-complete smart contracts executed on a zkEVM.

A critical issue was found in how the deposit cap for user deposits was managed. The system increases the totalDepositedAmountPerUser counter on deposits but does not decrease it on withdrawals. This oversight can lead to users being locked out from making new deposits once they hit the cap, even if they have withdrawn funds.

How the Issue Arises

  1. Initial Deposit: A user deposits funds, increasing their totalDepositedAmountPerUser.

  2. Withdrawal: The user withdraws funds, but the counter remains unchanged.

  3. Lockout: As the counter only increases and never decreases, users can quickly reach their deposit cap, preventing further deposits.

This vulnerability could lock users out of depositing again after a withdrawal, causing significant usability issues. For example, if a user deposits an amount close to the cap and then withdraws it, they could be immediately locked out from making new deposits.
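The accounting flaw is easy to model. The sketch below is a Python model of the audited pattern, not the actual zkSync Solidity code; totalDepositedAmountPerUser maps to the `total` dict, and the flag toggles between the buggy and fixed behavior.

```python
class DepositTracker:
    """Per-user deposit-cap accounting, sketching the audited pattern."""

    def __init__(self, cap, decrement_on_withdraw):
        self.cap = cap
        self.decrement_on_withdraw = decrement_on_withdraw
        self.total = {}   # models totalDepositedAmountPerUser

    def deposit(self, user, amount):
        if self.total.get(user, 0) + amount > self.cap:
            raise ValueError("deposit cap reached")
        self.total[user] = self.total.get(user, 0) + amount

    def withdraw(self, user, amount):
        if self.decrement_on_withdraw:   # the fix
            self.total[user] -= amount   # the buggy version skips this step

# Buggy behavior: the counter only ever increases.
buggy = DepositTracker(cap=100, decrement_on_withdraw=False)
buggy.deposit("alice", 100)
buggy.withdraw("alice", 100)
try:
    buggy.deposit("alice", 1)      # locked out despite a zero balance
    locked_out = False
except ValueError:
    locked_out = True
assert locked_out

# Fixed behavior: withdrawals free up deposit capacity again.
fixed = DepositTracker(cap=100, decrement_on_withdraw=True)
fixed.deposit("alice", 100)
fixed.withdraw("alice", 100)
fixed.deposit("alice", 1)          # accepted after the counter is decremented
```

The bug is not a broken invariant check; it is a missing state update, which is exactly why it survives testing that only exercises deposits.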

The Matter Labs team acknowledged the issue but decided not to resolve it immediately, stating that the deposit limitations are only temporary during the Fair Onboarding Alpha phase and will be removed in the Full Launch Alpha. This decision classifies the issue under "accepted risk."

What Does This Tell Us?

These examples from Linea and zkSync provide important insights into the complexities and challenges of maintaining secure and reliable ZKP systems.

Verifier Correctness: The correctness of the verifier is paramount in any ZKP system. Both Linea and zkSync encountered significant issues due to flaws in their verifiers. For Linea, incorrect sampling of random values allowed proof forgery, while zkSync faced potential user lockout due to improper state management. Ensuring that verifiers operate correctly is essential for maintaining the trust and reliability of ZKP systems. This involves rigorous testing, formal verification, and continuous monitoring to identify and rectify vulnerabilities. By ensuring verifier correctness, zkVerifier plays a critical role in preventing security breaches and maintaining system integrity.

Secure Random Value Generation: The Linea example underscores the necessity of using secure and unpredictable methods for random value generation in cryptographic protocols. Randomness is a cornerstone of ZKP security. In Linea's case, the predictable randomness allowed attackers to forge proofs, compromising the system. The resolution involved aligning the random value generation with cryptographic best practices, ensuring the unpredictability required to maintain system integrity. zkVerifier ensures that random values are generated securely, preventing similar vulnerabilities.

Accurate State Management: The zkSync example highlights the importance of precise state management in smart contracts. zkSync faced issues because the system did not correctly update the totalDepositedAmountPerUser counter upon withdrawals, potentially causing user lockout. Proper state management ensures that user actions, such as deposits and withdrawals, are accurately tracked and managed, preventing disruptions and maintaining the system's usability. zkVerifier ensures thorough testing and validation of state changes to maintain consistency and correctness, preventing such issues.

zkVerifier addresses these needs, playing a vital role in preventing vulnerabilities and ensuring the integrity of ZKP implementations.

Lumoz’s Reach: zkProver and zkVerifier

Lumoz, a leader in modular computing layers and zkRaaS, has taken a significant step forward in addressing this challenge with the launch of an AVS computation layer based on EigenLayer.

This new layer is composed of zkProver and zkVerifier, two critical components that significantly enhance computational power and security.

  • zkProver: This component focuses on generating ZKPs. The zkProver can efficiently generate proofs, leveraging substantial computational resources.

  • zkVerifier: By integrating with EigenLayer's Re-staking mechanism, zkVerifier not only utilizes Ethereum's security but also offers additional economic incentives for validators. This dual verification mechanism greatly bolsters network security and reduces trust risks.

zkProver leverages advanced polynomial commitment schemes to efficiently generate proofs, while zkVerifier uses parallelized Groth16 verification to ensure rapid and secure proof validation.

The launch of zkVerifier comes at a critical juncture as the industry seeks to overcome the limitations of traditional verification methods and embrace a more robust and trustless solution. With the increasing adoption of ZKPs across various domains, from privacy-preserving transactions to secure multi-party computation (MPC), the need for a reliable and efficient verification infrastructure has become more evident than ever.


Lumoz’s node sale is live. Invite code: SCKOX


Modular Computational Layer

Modular rollups present an innovative solution to the blockchain trilemma by dividing the blockchain into distinct layers, improving efficiency in transaction processing and data management.

  • Settlement Layer: Operating on L1, it updates statuses and verifies data to ensure accuracy.

  • Execution Layer: Processes transactions on the rollup, recording and updating transaction statuses quickly without waiting for main chain confirmation.

  • Consensus Layer: Ensures consensus and security using efficient algorithms like Proof of Stake or Proof of Authority.

  • Data Availability Layer: Records all transaction data on the rollup and ensures data traceability and integrity when updating asset statuses.

For ZK-Rollups, an additional core module, the Prover Layer, is essential.

The Lumoz computational layer serves as the core computational module with the following features:

  1. Decentralized Computing: Utilizing a hybrid consensus mechanism of Proof of Stake (PoS) and Proof of Work (PoW), it ensures decentralized computing power, supporting continuous ZKP capacity for zkRollups and secure data processing for AI applications.

  2. Robust Computational Stability: With thousands of GPU/CPU nodes, Lumoz ensures continuous computational capacity for ZKP and ZKFP, supporting large-scale parallel computing for AI tasks.

  3. Wide Compatibility: Compatible with major Rollup solutions like Polygon zkEVM, zkSync, Scroll, and Starknet, it also supports various AI computational needs.

  4. Cost-Efficiency: Combining a comprehensive economic model, Lumoz provides low-cost ZKP generation and economical AI computing solutions.

The Lumoz Network leverages deep research and continuous innovation in ZK computing, reducing costs and participation barriers. Users can join the network with flexible options:

  1. zkVerifier Node: Validates the computation processes and results of zkProver Nodes, ensuring accuracy and network reliability.

  2. zkProver Node: Acts as the computational engine, executing tasks and generating corresponding ZKPs or ZKFPs, ensuring the privacy and security of information.

Through these roles, the Lumoz Network offers a cutting-edge, secure, and efficient computing environment for blockchain and AI applications.




Integration of zkVerifier with Ethereum

zkVerifier seamlessly integrates with the broader blockchain ecosystem, particularly Ethereum, through a robust mechanism for publishing verification results directly to the Ethereum blockchain. Once zkVerifier nodes independently verify proofs from zkProver, these results are transmitted to Ethereum. There, Ethereum generates verification proofs, leveraging its security infrastructure to confirm zkVerifier's results.

These Ethereum-generated verification proofs provide the final authoritative confirmation of data validity, ensuring the accuracy and reliability of zkVerifier's data. This critical validation step maintains the integrity and trustworthiness of the verification process across different blockchain platforms.

The deep integration with Ethereum enhances cross-chain interoperability, allowing zkVerifier to interact with various blockchain networks relying on Ethereum for security and validation (mainly for data availability and economic security, i.e., AVSs). This interoperability is crucial for diverse blockchain applications to collaborate and operate securely.

Additionally, zkVerifier benefits from Ethereum's robust, decentralized, and tamper-resistant infrastructure, protecting the verification process against potential attacks and vulnerabilities. This integration significantly enhances zkVerifier's overall security and reliability, making it a trustworthy component within the blockchain ecosystem.

Through this visualization, I have tried to capture the distributed nature of zkVerifier, illustrating how multiple independent verification nodes ensure the authority and consistency of verification results through a robust consensus mechanism.

Integration of Proofs from Multiple Sources

zkVerifier integrates proofs from a diverse array of zero-knowledge proof (ZKP) systems, supporting a wide range of applications within the blockchain ecosystem. This integration is achieved through a modular architecture that allows zkVerifier to process and validate proofs generated by different proof systems, such as zk-SNARKs, zk-STARKs, Bulletproofs, etc. Each of these systems has unique characteristics and requirements, but zkVerifier is equipped to handle them seamlessly.

By supporting multiple proof systems, zkVerifier enhances the flexibility and robustness of the blockchain ecosystem. It can validate proofs for various applications, including privacy-preserving transactions, identity verification, and secure computations, without being limited to a single ZKP framework. This flexibility allows developers to choose the most suitable ZKP system for their specific use case, knowing that zkVerifier can accommodate their choice and ensure the integrity of the proofs.
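One way such a modular architecture can be organized is a registry that dispatches each incoming proof to the verification routine registered for its proof system. The backend names and the trivial placeholder "verifiers" below are illustrative assumptions, not zkVerifier's actual interface.

```python
from typing import Callable, Dict

# Sketch of a modular verifier registry: each proof system plugs its own
# verification routine in behind one common interface.
class ModularVerifier:
    def __init__(self):
        self.backends: Dict[str, Callable[[bytes, bytes], bool]] = {}

    def register(self, system: str, verify_fn: Callable[[bytes, bytes], bool]):
        self.backends[system] = verify_fn

    def verify(self, system: str, statement: bytes, proof: bytes) -> bool:
        if system not in self.backends:
            raise ValueError(f"no backend for proof system {system!r}")
        return self.backends[system](statement, proof)

mv = ModularVerifier()
# Placeholder checks only; real backends would run the actual verification math.
mv.register("groth16", lambda stmt, prf: len(prf) == 3 * 64)      # fixed-size proof
mv.register("stark",   lambda stmt, prf: prf.startswith(b"FRI"))  # tagged proof

assert mv.verify("groth16", b"stmt", b"\x00" * 192)
assert mv.verify("stark", b"stmt", b"FRI...")
```

New proof systems are added by registering a backend, leaving the dispatch path and the rest of the pipeline untouched.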

Proof Processing and Verification Mechanisms

One of the standout features of zkVerifier is its meticulously designed proof processing and verification mechanisms. These mechanisms are optimized to reduce the gas costs associated with proof submission, making the process more cost-effective for users. Gas costs can be significant, particularly for complex transactions involving cryptographic proofs. zkVerifier addresses this challenge through several key optimizations:

  1. Parallel Computing: zkVerifier employs parallel computing techniques to handle multiple proofs simultaneously. This reduces the time required for verification and spreads the computational load across multiple processors, improving overall efficiency.

  2. Caching Mechanisms: To minimize redundant computations, zkVerifier uses caching mechanisms that store intermediate results and frequently accessed data. This reduces the number of operations that need to be performed repeatedly, lowering the computational overhead and associated gas costs.

  3. Efficient Data Structures: The use of advanced data structures allows zkVerifier to organize and access data more efficiently. This contributes to faster processing times and reduces the gas required for data retrieval and manipulation during the verification process.

By implementing these optimizations, zkVerifier ensures that the cost of submitting and verifying proofs is significantly reduced, providing a more affordable and accessible solution for users and developers.
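Two of these optimizations, parallel verification and caching of proof-independent preprocessing, can be sketched together. The `verify_one` check below is a placeholder that always accepts; a real verifier would run pairing checks at that point.

```python
from concurrent.futures import ThreadPoolExecutor
from functools import lru_cache
import hashlib

@lru_cache(maxsize=128)
def preprocess_vk(vk: bytes) -> bytes:
    # Expensive, proof-independent work done once per verification key
    # and reused (cached) for every proof verified against that key.
    return hashlib.sha256(vk).digest()

def verify_one(vk: bytes, proof: bytes) -> bool:
    prepared = preprocess_vk(vk)          # cached across proofs
    # Placeholder: always accepts. A real verifier runs its checks here.
    return bool(hashlib.sha256(prepared + proof).digest())

def verify_batch(vk: bytes, proofs):
    # Spread the verification load across worker threads.
    with ThreadPoolExecutor(max_workers=8) as pool:
        return list(pool.map(lambda pr: verify_one(vk, pr), proofs))

results = verify_batch(b"vk", [f"proof-{i}".encode() for i in range(100)])
assert all(results)
# Preprocessing was reused rather than recomputed per proof.
assert preprocess_vk.cache_info().hits >= 90
```

The same shape applies whether the cached artifact is a prepared verification key, a precomputed pairing, or a frequently accessed lookup table.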

Adaptability to Different Proof Characteristics

zkVerifier is also designed to be highly adaptable to the characteristics of proofs generated by different ZKP systems. This adaptability is crucial for ensuring efficient system operation, as different proof systems can vary widely in terms of proof size, verification time, and verification logic. zkVerifier achieves this adaptability through several means:

  1. Dynamic Resource Allocation: zkVerifier dynamically allocates computational resources based on the specific requirements of each proof. For instance, larger proofs or those requiring more complex verification logic are assigned more resources to ensure timely processing.

  2. Scalable Verification Logic: The verification logic within zkVerifier is scalable, allowing it to adjust the depth and breadth of verification checks based on the complexity of the proof. This ensures that each proof is verified thoroughly without unnecessary delays.

  3. Proof Size Management: zkVerifier is capable of handling proofs of varying sizes efficiently. It uses optimized algorithms to manage memory and processing power, ensuring that even large proofs are processed without causing bottlenecks or excessive resource consumption.

By being adaptable to different proof characteristics, zkVerifier ensures that it can efficiently verify a wide range of proofs, maintaining high performance and reliability across diverse applications.

Leveraging a Distributed and Decentralized Architecture with zkVerifier

Distributed Verification Nodes: zkVerifier leverages a distributed and decentralized architecture by deploying multiple dedicated verification nodes that operate independently. These nodes are strategically distributed across the network, ensuring that verification tasks are not centralized in a single location. Each node performs verification independently, adding a layer of robustness and reliability to the verification process. This distributed approach ensures that the verification results are not reliant on a single entity, thereby enhancing the overall security and trustworthiness of the system.

Collective Decision-Making Process: One of the key strengths of zkVerifier is its collective decision-making process. To validate a proof, zkVerifier requires that at least two-thirds of the verification nodes confirm the proof's validity. This threshold ensures a high level of consensus among the nodes, making it exceedingly difficult for fraudulent proofs to be accepted. The requirement for a supermajority agreement mitigates the risk of collusion or manipulation by a minority of nodes, thereby enhancing the security and reliability of the verification process. This consensus mechanism is pivotal in maintaining the integrity of the system and ensuring that only valid proofs are accepted.
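The supermajority rule itself is simple to state in code. A minimal sketch, with vote collection and node identity handling omitted:

```python
from fractions import Fraction

def proof_accepted(votes):
    """Supermajority rule: accept iff at least 2/3 of nodes confirm validity.

    `votes` is one boolean per verification node. Fraction avoids the
    floating-point rounding that could mis-handle the exact 2/3 boundary.
    """
    confirmations = sum(votes)
    return Fraction(confirmations, len(votes)) >= Fraction(2, 3)

assert proof_accepted([True] * 7 + [False] * 2)      # 7/9 >= 2/3 -> accepted
assert not proof_accepted([True] * 5 + [False] * 4)  # 5/9 <  2/3 -> rejected
assert proof_accepted([True, True, False])           # exactly 2/3 counts as "at least"
```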

Alignment with Blockchain Principles: The decentralized architecture of zkVerifier aligns perfectly with the core principles of blockchain technology, which include trust, transparency, and resilience against single points of failure. By distributing the verification tasks across multiple nodes, zkVerifier ensures that no single node has undue influence over the verification process. This decentralization promotes transparency, as the verification process is open and distributed, making it more difficult for any single actor to manipulate the results.

Moreover, the resilience of the system is significantly enhanced, as the failure or compromise of a single node does not affect the overall verification process. This distributed nature makes zkVerifier robust against attacks and failures, ensuring continuous and reliable operation. By upholding these fundamental blockchain principles, zkVerifier not only enhances the security and reliability of ZKP verification but also fosters a more trustworthy and transparent blockchain ecosystem.

Customized Release Strategies

To optimize the use of on-chain resources and ensure efficient proof transmission, zkVerifier employs customized release strategies. These strategies are tailored to the specific characteristics of the proofs and the requirements of the blockchain network. The main goals of these strategies are to reduce network congestion, improve transaction speed, and ensure the reliability of proof submissions.

  1. Batch Processing: zkVerifier can aggregate multiple proofs into a single batch before submission to the blockchain. This reduces the number of individual transactions and associated overhead, leading to lower gas costs and faster processing times.

  2. Prioritized Proof Handling: zkVerifier prioritizes proofs based on their urgency and importance. Critical proofs are processed and submitted more quickly, ensuring that high-priority transactions are confirmed without delay.

  3. Optimized Data Availability Layer: The data availability layer within zkVerifier ensures that proofs are accessible and durable, even under high network load. This layer uses efficient storage strategies to keep proof data readily available for verification without causing network congestion.

  4. Adaptive Proof Release Timing: zkVerifier adjusts the timing of proof releases based on current network conditions. During periods of high congestion, proofs may be delayed slightly to avoid adding to the load, while during low congestion periods, proofs are submitted more aggressively to maximize throughput.

These customized release strategies ensure that zkVerifier not only processes and verifies proofs efficiently but also transmits them in a way that optimizes the use of on-chain resources, reduces network congestion, and enhances overall transaction speed.
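Batch processing and prioritized handling (strategies 1 and 2 above) can be sketched with a small queue that flushes either when a batch fills or when an urgent proof arrives; `submit_to_chain` is a placeholder for the actual L1 submission call.

```python
import heapq

class ProofBatcher:
    """Queue proofs and submit them as one batched on-chain transaction."""

    def __init__(self, batch_size, submit_to_chain):
        self.batch_size = batch_size
        self.submit = submit_to_chain
        self.queue = []   # min-heap of (priority, seq, proof); 0 = most urgent
        self.seq = 0      # tie-breaker preserving arrival order

    def add(self, proof, priority=10):
        heapq.heappush(self.queue, (priority, self.seq, proof))
        self.seq += 1
        # Flush early for urgent proofs, or once the batch is full.
        if priority == 0 or len(self.queue) >= self.batch_size:
            self.flush()

    def flush(self):
        if not self.queue:
            return
        batch = [heapq.heappop(self.queue)[2] for _ in range(len(self.queue))]
        self.submit(batch)   # one transaction instead of many

submitted = []
batcher = ProofBatcher(batch_size=3, submit_to_chain=submitted.append)
batcher.add("p1")
batcher.add("p2")
assert submitted == []                        # still waiting for a full batch
batcher.add("urgent", priority=0)             # urgent proof forces a flush
assert submitted == [["urgent", "p1", "p2"]]  # urgent proof leads the batch
```

Adaptive release timing (strategy 4) would extend this by gating `flush` on an observed congestion signal instead of only on batch size and urgency.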

Enhancing ZKP Applications with Lumoz's Advanced Computational Layer

Lumoz’s computational layer, encompassing both zkProver and zkVerifier, stands to significantly influence the blockchain landscape by drawing in developers and projects eager to build ZKP-based applications. Its efficient and secure infrastructure, coupled with integration into EigenLayer's Actively Validated Services (AVS), provides a robust foundation for fostering innovation and expanding the ZKP ecosystem.

With Lumoz’s advanced computational layer, developers have access to highly efficient and secure mechanisms for their applications. By simplifying the generation and verification of ZKPs, and reducing associated costs, Lumoz lowers the entry barriers for developers. This accessibility encourages a diverse range of projects to harness ZKPs, creating a vibrant ecosystem of innovative applications. The Lumoz platform, powered by zkProver and zkVerifier, becomes an attractive environment for developers aiming to create cutting-edge blockchain solutions.

The deep integration with Ethereum not only ensures the accuracy of verification proofs but also facilitates seamless cross-chain interoperability. By publishing verification results to Ethereum, Lumoz can interact with various other blockchain networks that rely on Ethereum for security and validation. This interoperability is crucial for enabling diverse blockchain applications to collaborate and operate securely.

Additionally, Lumoz benefits from Ethereum's robust, decentralized, and tamper-resistant infrastructure, protecting the verification process against potential attacks and vulnerabilities. This integration significantly enhances Lumoz's overall security, making it a more reliable and trustworthy component within the blockchain ecosystem.

Finally, by integrating with EigenLayer’s AVS, Lumoz enhances the security and profitability of its computational layer. EigenLayer’s re-staking mechanism allows users to stake their tokens on trusted validators, who then provide secure and reliable validation services. This mechanism creates a robust economic incentive structure, encouraging validators to act honestly and efficiently. By leveraging AVS, Lumoz ensures that its computational tasks are verified through a decentralized and economically secure framework, further enhancing the reliability of its services.

Conclusion

The integrity of ZKP systems hinges on the reliability of both the prover and the verifier. Lumoz, recognizing this critical aspect, has developed an AVS computational layer based on EigenLayer, featuring zkProver and zkVerifier components. These innovations address vulnerabilities in ZKP systems, ensuring secure and reliable verification processes.

Real-world examples from Linea and zkSync highlight the importance of robust verification. Lumoz's zkVerifier prevents such vulnerabilities through secure random value generation, precise state management, and formal verification methods, maintaining the integrity of the system.

Lumoz's modular computing layer enhances blockchain efficiency and scalability by dividing the blockchain into distinct layers, including the essential Prover Layer for ZK-Rollups. This design supports continuous ZKP capacity and caters to AI computational needs.




Disclaimer

This article was written by Arhat Bhagwatkar and proofread by the Lumoz team. For institutions: to participate in the Lumoz zkVerifier node sale, please reach out to Justin on Telegram.

The views and opinions expressed in this research report are those of the author and do not necessarily reflect the official policy or position of Lumoz or its affiliates. While every effort has been made to ensure the accuracy and completeness of the information presented, readers are encouraged to conduct their own research and consult with appropriate professionals before making any decisions based on the information provided in this article.

The author will not be held responsible for any loss or damage, including but not limited to, indirect or consequential loss or damage, arising from the use of, or reliance on, the information presented herein.


Thank you for reading through, and subscribe below for regular post updates.

I’d also appreciate it if you shared this with your friends, who would enjoy reading this.

You can contact me here: Twitter and LinkedIn.


Previous Research:

  1. Decoding & Democratizing Web3

  2. P2E: A Shift in Gaming Business Models

  3. Stablecoins: Is there hope?

  4. If you don't control your data why do you trust it

  5. Primer on L2 Scaling Solutions

  6. Understanding User Dynamics in DeFi

  7. Intro to Lending and Borrowing Mechanics in DeFi

  8. Part 2: DeFi Deep Dive on COMP, AAVE, and MKR

  9. Best Way to Create Value with Data in Web3

  10. Building a Decentralized Climate Finance DAO

  11. Org vs. DAOs: Governance & Growth in Modern Society

  12. ERC-4337: The Future of Ethereum Token Standards

  13. Identity Without Borders: Decoding My Online Identity

  14. Web3s 3-Wave Model of Evolution of Complex Systems

  15. Understanding Tokenomics: Case Study of dYdX

  16. DeFi Hacks Unveiled: What We've Learned from Q2

  17. Voting Mechanisms & Incentives for Governance in DAOs

  18. Uniswap’s Fee Switch Dilemma

  19. MakerDAO's Endgame: 5 Phases and 14 MIPs

  20. Liquid Staking Tokens: Can They Bounce Back?

  21. Binance Smart Chain: Luban Hard Fork

  22. crvUSD: A Stable Alternative?

  23. friend.tech: Tokenizing Incentives for "friends"

  24. MEV Endgame: Exploring Mempool Privacy Schemes

  25. Privacy Pools: Towards Practical Privacy & Compliance with Smart Contracts

  26. Rollup Roulette: Deep Dive into Shared Liquidity

  27. Modeling Player-Centric P2E (Tokenless) Tokenomics

  28. Unpacking FRAX v3: Hybrid Assets, Modular Design

  29. zkProofs & Recursive SNARKs

  30. Starport Kernel Framework for Lending Protocols
