2025: 20 Predictions - The Full Web3 Landscape from Scalability to Privacy
Original Title: 20 Predictions For 2025
Original Source: Equilibrium Research
Original Translation: Yuliya, PANews
Predicting the future is an extremely challenging task; some would even call it impossible. Yet everyone engages in some form of prediction and must make decisions based on their judgment of future trends.
Equilibrium has released its first annual prediction report, looking ahead to events that may occur by the end of next year and to the direction of industry development. The report was produced jointly by Equilibrium's research lab and venture arm.
Before delving into the specifics, here is the methodology behind these predictions:
· These predictions focus on maintaining relevance (technology-oriented), specificity, and falsifiability. Therefore, the report will not include price predictions or vague statements (e.g., "ZK will become faster and cheaper").
· The scope of predictions is strictly limited to Equilibrium's areas of expertise. These predictions reflect its mission to design, build, and invest in the core infrastructure of decentralized networks. Accordingly, the report does not cover applications, stablecoins, decentralized finance, governance, or other areas, although these are equally worth attention.

Scalability
1. The number of Ethereum scalability solutions (L2/L3) will exceed 2,000
Currently, L2Beat lists 120 L2 and L3 projects (referred to as "Ethereum scalability solutions"). Ethereum's modularization process will continue to accelerate in 2025, and by the end of the year, the number of scalability solutions will surpass 2,000, representing a growth of about 17 times the current scale.
The additional L2/L3 projects mainly come from two directions: application-specific scalability solutions (gaming, decentralized finance, payments, social, etc.) and "enterprise-grade" L2 solutions (traditional companies expanding into the blockchain space, such as Coinbase or Kraken).
2. Ethereum scalability multiple will exceed 200x
The scaling multiple is the ratio of the combined daily-average UOPS (or TPS) of all Ethereum scaling solutions to that of the Ethereum L1 (data sourced from L2Beat and rollup.wtf). This value currently fluctuates around 25x, so reaching over 200x requires at least a further 8x increase (achieved by optimizing existing solutions and introducing new ones).
The L2 scaling factor reflects both user demand for Ethereum L2/L3 applications and the underlying infrastructure's scalability. From a broader perspective, it showcases the success of Ethereum's rollup-centric scaling roadmap relative to Ethereum L1's on-chain capacity requirements.
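The arithmetic behind this prediction can be sketched directly. The UOPS figures below are illustrative placeholders chosen to be consistent with the ~25x value above, not live L2Beat data.

```python
# Illustrative calculation of the Ethereum "scaling factor": the combined
# daily-average UOPS of all L2/L3 solutions divided by the daily-average
# UOPS of Ethereum L1. The inputs are placeholder values, not live data.

def scaling_factor(l2_uops_sum: float, l1_uops: float) -> float:
    """Ratio of aggregate L2/L3 throughput to L1 throughput."""
    return l2_uops_sum / l1_uops

# Roughly the situation described in the text: ~25x today.
current = scaling_factor(l2_uops_sum=350.0, l1_uops=14.0)

# To exceed 200x with L1 throughput unchanged, aggregate L2/L3
# throughput must grow by at least 200 / 25 = 8x.
required_growth = 200 / current
print(f"current factor: {current:.0f}x, growth needed: {required_growth:.1f}x")
```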

Ethereum Scaling Solution vs. Ethereum L1 Daily Average UOPS (Source: L2Beat)
3. Solana Transaction Throughput to Exceed 5,000 TPS (Non-Vote Transactions)
Over the past year, driven by the development of the decentralized finance ecosystem, meme coin frenzy, DePIN, and growth in several other areas, Solana's on-chain throughput has remained high. This has not only facilitated thorough stress testing but has also propelled the core team to continuously enhance network performance. While more teams are focusing on scaling the Solana network, improving Solana L1 performance undoubtedly remains a top priority for the core development team.

Source: Solana Roadmap
In recent months, Solana's non-vote transaction throughput has averaged between 700-800 transactions per second, with peaks reaching 3,500 transactions per second. It is projected that by 2025, this figure will grow to an average of over 5,000 non-vote transactions per second, representing a 6-7x increase from current levels. Peak levels could be significantly higher.

Solana's average transaction throughput in recent months has been maintained at 700-800 transactions per second (Source: Blockworks Research)
Key network upgrades to achieve this goal are expected to include:
· Full deployment of the Firedancer client on mainnet: This is the most anticipated major upgrade. While there may be a progressive rollout in terms of stake distribution, the overall performance improvements seem quite substantial (not to mention the robustness of having two clients in a production environment).
· Improve Core Anza Client: Another core client developed by Anza can draw from Firedancer's experiences and design choices to optimize its own design.
· Other Performance Optimizations: These include a more granular fee market, more efficient scheduling, and program compression schemes to improve on-chain resource utilization. Another observation is that Solana's blocks are nearing capacity (average usage of about 40m compute units against a 48m cap), so raising the block compute limit is also an option.
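The relationship between the block compute-unit budget and achievable throughput can be sketched as follows. The 40m/48m compute-unit figures come from the text; the ~400 ms slot time and the per-transaction compute cost are assumptions for illustration only.

```python
# Back-of-envelope: how the block compute-unit (CU) limit bounds Solana TPS.
# 40m CU average usage and the 48m CU cap are from the text; the slot time
# and per-transaction CU cost are illustrative assumptions.

SLOT_TIME_S = 0.4          # assumed Solana slot time
BLOCK_CU_CAP = 48_000_000  # current per-block CU limit (from the text)
AVG_CU_USED = 40_000_000   # average CU consumed per block (from the text)

def max_tps(block_cu: int, avg_tx_cu: int, slot_s: float) -> float:
    """Upper bound on transactions per second given a per-block CU budget."""
    return (block_cu / avg_tx_cu) / slot_s

# With a hypothetical average of 5,000 CU per transaction:
print(f"cap utilisation today: {AVG_CU_USED / BLOCK_CU_CAP:.0%}")
print(f"TPS bound at 48m CU: {max_tps(BLOCK_CU_CAP, 5_000, SLOT_TIME_S):,.0f}")
print(f"TPS bound at 96m CU: {max_tps(96_000_000, 5_000, SLOT_TIME_S):,.0f}")
```

Doubling the block size roughly doubles the bound, which is why raising the CU cap is listed as one of the levers alongside client improvements.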
4. Over 80% of L2/L3 Data Will Be Published to Alternative DA Layer
L2 and L3 can choose to publish data to Ethereum (in blob or calldata form), an alternative DA layer (such as Avail, Celestia, EigenDA, and NearDA), or to an external data availability committee (in extreme cases, data stored in a single node only).
Currently, about 35% of L2/L3 data is published to an alternative DA layer (the graph below does not include Avail, NearDA, and EigenDA), with the remaining data being published to Ethereum (primarily in blob form). Relevant metrics and dashboards can be viewed on Celestia, Ethereum, and GrowThePie.
By 2025, it is expected that over 80% of data will be published to an alternative DA layer. Based on Pectra's updates regarding the target blob and maximum blob increase, this will mean a 10-30x growth in the volume of data published to alternative DA layers compared to current levels. This growth will be driven by high-throughput rollups (such as Eclipse and MegaETH expected to drive the development of Celestia and EigenDA) and a native rollup ecosystem built on Celestia and Avail.
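A quick consistency check on these figures: given the share rising from ~35% to 80% and alt-DA volume growing 10-30x, what growth in total published data does that imply? The percentages are from the text; the rest is ratio arithmetic.

```python
# Consistency check: the text predicts the alt-DA share rising from ~35%
# to >80% while absolute alt-DA volume grows 10-30x. What total L2/L3
# data growth does that imply? Shares are from the text.

TODAY_SHARE = 0.35
FUTURE_SHARE = 0.80

def implied_total_growth(alt_da_growth: float) -> float:
    """Total-data growth implied by a given alt-DA volume growth."""
    return alt_da_growth * TODAY_SHARE / FUTURE_SHARE

for g in (10, 30):
    print(f"{g}x alt-DA growth -> {implied_total_growth(g):.1f}x total data growth")
```

In other words, the prediction implies roughly a 4-13x increase in total data published by all L2s/L3s, with alternative DA layers absorbing most of it.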

Data Source: GrowThePie
5. ZK-Based Scaling Solutions Will Surpass Optimistic Solutions (by Deployment Count)
Currently, among the scaling solutions listed on L2Beat, only about 25% (30 out of 120) are validity rollups or validium (utilizing ZKP to prove the correctness of state transitions and publishing data to Ethereum or alternative DA layers/external data availability committees).
As ZK proving and verification become faster and cheaper, the long-term advantage of optimistic rollup scaling solutions is diminishing. Validity rollups like Starknet have already set scaling records (and this is just the beginning). Meanwhile, ZK-based scaling solutions offer stronger guarantees for asynchronous interoperability than optimistic ones. Finally, as proving and verification get faster and cheaper, latency (time to finality) naturally decreases without weakening the underlying trust assumptions.
Therefore, it is expected that by the end of 2025, the share of ZK-based scaling solutions will increase to over 50% (likely significantly higher). Multiple ZK technology stacks (such as Polygon, ZKsync, and Scroll) are expected to release production-ready chain development kits, making it easier to deploy new rollups or validiums. Additionally, there is growing interest in converting existing optimistic rollups into validity rollups (for example, by leveraging OP Succinct or Kakarot zkEVM for proofs).
6. Ethereum's Maximum Gas Limit Will Double to 60m Gas Per Block
While Ethereum focuses on a rollup-centric scaling roadmap, the L1 layer still plays a vital role for many high-value applications that are not very sensitive to Gas costs. Over the past year, there have been calls from various sources within and outside the Ethereum Foundation to increase the Gas limit.
The current maximum Gas limit per block is 30m Gas (with a target of 15m), a value that has remained unchanged since 2021. Since then, blocks have been consistently at the target level (50% of the maximum limit). It is expected that this limit will double in 2025 with a new maximum limit of 60m Gas, and a block target of 30m Gas. However, this requires the following conditions to be met:
· Fusaka upgrade implemented in 2025
· Ethereum's core developer community agrees to raise the gas limit as part of Fusaka
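To put the proposed limit in concrete terms, the calculation below bounds simple-transfer throughput at each gas limit. The 21,000 gas intrinsic cost of a plain ETH transfer and the 12-second slot time are protocol constants; the 30m/60m limits are from the text.

```python
# What doubling the block gas limit means for simple-transfer capacity.
# 21,000 gas per plain ETH transfer and the 12 s slot time are protocol
# constants; the 30m and 60m gas limits are the values from the text.

SLOT_TIME_S = 12
TRANSFER_GAS = 21_000

def max_transfer_tps(gas_limit: int) -> float:
    """Upper bound on plain ETH transfers per second at a given gas limit."""
    return (gas_limit / TRANSFER_GAS) / SLOT_TIME_S

print(f"30m limit: {max_transfer_tps(30_000_000):.0f} TPS")
print(f"60m limit: {max_transfer_tps(60_000_000):.0f} TPS")
```

Real blocks contain far more expensive transactions than plain transfers, so actual throughput is lower, but the bound scales linearly with the limit either way.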
ZK Proof
7. By the End of 2025, Every Ethereum Block Will Be Proven
ZK-proofing Ethereum blocks can make validating correct execution easier. For example, this would benefit light clients that currently rely only on consensus/validator signatures.
By running EVM execution through a generic zkVM, proving each Ethereum block is already feasible at this stage, with an annual cost of approximately $1 million (considering the pace of technological advancement, this cost may have already decreased by the time of this article's publication).
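The annual figure translates into a small per-block cost, which can be checked from Ethereum's 12-second slot time and the ~$1M/year number quoted above.

```python
# Back-of-envelope on the "~$1M per year to prove every block" figure
# from the text, using Ethereum's 12 s slot time.

SECONDS_PER_YEAR = 365 * 24 * 3600
SLOT_TIME_S = 12
ANNUAL_COST_USD = 1_000_000  # figure quoted in the text

blocks_per_year = SECONDS_PER_YEAR // SLOT_TIME_S
cost_per_block = ANNUAL_COST_USD / blocks_per_year

print(f"blocks/year: {blocks_per_year:,}")
print(f"cost/block: ${cost_per_block:.2f}")
```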
Although the proof may cause a delay of a few minutes (currently taking this long on average to generate a proof for an Ethereum block), it is still beneficial for services that are not highly time-sensitive. As costs and proof times decrease, relying on ZK proofs will become viable for a wider range of use cases. This leads to the next prediction:
8. A generic zkVM will be able to prove an Ethereum mainnet block within 30 seconds
The Ethereum roadmap includes eventually embedding its own zkEVM into the core protocol, which will help avoid redundant execution and allow other services to easily verify the correctness of executions. However, implementation may still take several years.
During this period, the generic zkVM can be utilized to prove state transitions. Over the past year, the zkVM has seen significant improvements in performance and offers a developer-friendly experience (e.g., writing programs in Rust only).
Proving an Ethereum block within 30 seconds is an ambitious goal, but RISC Zero has already claimed a 90-second proof time. Nevertheless, in the long term, proof times need to drop by at least another order of magnitude to achieve real-time proving for Ethereum: given the 12-second block time, proofs must be fast enough to leave time for communication, verification, and voting.
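The real-time proving budget follows from the slot structure. The 4-second overhead reservation below is a hypothetical split for illustration, not a protocol parameter.

```python
# Real-time proving budget: within one 12 s slot, the proof must be
# generated AND leave time for propagation, verification, and voting.
# The overhead split is a hypothetical illustration.

SLOT_TIME_S = 12.0

def max_proving_time(network_overhead_s: float) -> float:
    """Time left for proof generation after network/verification overhead."""
    return SLOT_TIME_S - network_overhead_s

# If, say, 4 s is reserved for communication, verification, and voting,
# proofs must complete in ~8 s, i.e. more than 10x faster than the
# 90 s figure mentioned in the text.
budget = max_proving_time(4.0)
print(f"proving budget: {budget:.0f} s, speedup needed vs 90 s: {90 / budget:.2f}x")
```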
9. Over 90% of blockchain ZK proofs will be generated in a decentralized manner
Currently, most ZKPs are generated in a centralized manner by core teams. This approach is costly (suboptimal hardware utilization), undermines censorship resistance, and adds complexity for teams needing ZKPs but not necessarily wanting to run their proof infrastructure.
While it is possible to build specific network-centralized proofs (i.e., only for specific L2 or use cases), a decentralized proof network can offer lower costs, operational simplification, and better censorship resistance. The price advantage comes from decentralized networks being able to find the cheapest computing resources globally and achieve higher hardware utilization (users only pay for the computing resources they use).
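The price advantage can be illustrated with a toy matching rule: route each proof job to the cheapest idle prover rather than a fixed in-house machine. Prover names and prices below are hypothetical; real networks use auctions and cryptographic accountability on top of this idea.

```python
# Toy sketch of why a decentralized prover network can be cheaper:
# proof jobs are matched to the lowest-priced available prover instead
# of a fixed in-house machine. Names and prices are hypothetical.

from dataclasses import dataclass

@dataclass
class Prover:
    name: str
    price_per_proof: float  # offered price in USD
    busy: bool = False

def assign_job(provers: list[Prover]) -> Prover:
    """Pick the cheapest idle prover (price-priority matching)."""
    idle = [p for p in provers if not p.busy]
    cheapest = min(idle, key=lambda p: p.price_per_proof)
    cheapest.busy = True
    return cheapest

market = [Prover("gpu-farm-a", 0.45), Prover("datacenter-b", 0.30),
          Prover("home-rig-c", 0.25)]
first = assign_job(market)   # cheapest: home-rig-c at $0.25
second = assign_job(market)  # next cheapest idle: datacenter-b at $0.30
print(first.name, second.name)
```

The same structure also explains the utilization argument: idle machines simply receive no jobs and incur no cost to users.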
For these reasons, it is expected that most projects will choose to outsource their proofs (several projects are already following this trend), and by the end of 2025, decentralized proof networks will generate over 90% of all ZK proofs. Gevulot will become the first production-ready proof network capable of handling large proof volumes, and as the industry expands, more similar networks will emerge.
Privacy
10. Privacy Blockchain Applications Will Have Their "ChatGPT Moment"
Before ChatGPT emerged, most people had not considered the use cases and benefits of AI and LLM. This situation changed overnight, and now most people have interacted with LLMs or at least understand how they work.
A similar shift is likely to occur in the privacy blockchain space. While many are still questioning the seriousness of on-chain privacy issues (or are not even aware of it), privacy is crucial for protecting individuals and businesses using blockchain and can enhance the expressive power of blockchain (i.e., what can be built on top of it).
Although privacy itself is rarely a selling point, the following framework can be used to identify categories where privacy has the highest value:
1. When the cost of transparency (non-privacy) is high:
· Individual level (e.g., political opinion polling applications, where exposure of identity could make one a political target)
· Business level (e.g., regulations related to customer data privacy or leakage of competitive information)
2. When privacy brings direct economic benefits:
· Improved execution and avoidance of front-running (dark pools)
· Preventing others from seeing and replicating transaction strategies, etc.
3. When collaboration has high friction:
· Privacy can act as a "business enabler," especially in situations where collaboration is impossible or costly
· If the goal is to reduce reliance on a single third party, then programmable and expressive private computation is needed
4. When achieving entirely new use cases:
· By enhancing blockchain's expressive power, new applications that were previously impossible can be realized
· Over the long term, this is most attractive, especially for applications requiring private shared states (e.g., gaming, social graphs, etc.)
11. Zama's MPC Threshold Decryption Library Will Become the De Facto Standard
Zama, the FHE infrastructure for developing blockchain and AI, is expected to soon release its MPC decryption network library. This will be the first major open-source library of its kind.
Given the limited competition, it may become the de facto standard that everyone benchmarks and compares against, similar to the roles Arkworks and MP-SPDZ play in the ZKP and MPC fields. However, this will largely depend on how permissive the license is.
12. Nym's Decentralized VPN Network to Reach 10% of TOR Network Users
Nym focuses on base-layer and network privacy. Nym's mixnet can be integrated into any blockchain, wallet, or application to protect IP addresses and traffic patterns. Additionally, NymVPN provides a decentralized VPN (currently in public testing) with the following features:
· 5-hop next-generation Nym mixnet, which offers enhanced privacy guarantees through advanced onion encryption, data shuffling, mixing, and traffic obfuscation
· 2-hop secure WireGuard decentralized VPN with onion encryption but without traffic obfuscation, for a fast dual-hop decentralized mode
To incentivize the supply side, Nym plans to run "Privacy Supply Incentives" to increase the node count of its VPN network. However, for the demand side, they need to prove that their product is worth using.
Ten percent of TOR usage (TOR averages around 2-3 million users) translates to 200,000-300,000 NymVPN users. While this goal is ambitious, it is achievable provided the team executes effectively on the marketing front. In the short term, crypto-economic incentives can also be used to drive demand and subsidize usage.
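The target is simple arithmetic on the TOR user range cited above:

```python
# Sanity check on the Nym target: 10% of TOR's average user base.
TOR_USERS = (2_000_000, 3_000_000)  # range cited in the text
target = tuple(int(u * 0.10) for u in TOR_USERS)
print(f"10% of TOR users: {target[0]:,} - {target[1]:,}")
```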
13. At Least One Major Rollup Provider Will Integrate Privacy Computation (Production Ready)
In addition to privacy-first approaches adopted by teams like Aztec, Aleo, and Namada, another approach is to enable existing transparent networks to outsource computation requiring privacy guarantees. This "privacy add-on" or "privacy as a service" approach allows applications and networks to achieve some privacy guarantees without needing to redeploy on a new privacy-centric network and lose existing network effects.
Private computation can be implemented through several approaches, with providers including:
· MPC-focused: Arcium, Nillion, Taceo, SodaLabs, etc.
· FHE-focused: Zama, Fhenix, Inco, etc.
· TEE-focused: Secret Network and Oasis Protocol
It is expected that at least one major rollup provider (Optimism, Arbitrum, Base, Starknet, ZK Sync, Scroll, etc.) will integrate one or more of these privacy computation providers and enable their upper-layer applications to be used in a production-ready environment.
14. Over 3 Startups to Raise Funds to Accelerate IO (Indistinguishability Obfuscation) Development
Indistinguishability Obfuscation (IO), in simple terms, is a cryptographic form that can hide (obfuscate) a program's implementation while still allowing a user to execute it. It involves transforming a program or circuit into an "obfuscated" version that is difficult to reverse-engineer, yet the obfuscated program still performs the same function as the original program. In addition to providing verifiable computation guarantees similar to ZKP, IO can also support private multi-party computation, keeping secrets and only using them under specific conditions.
Although IO is currently slow, costly, and practically infeasible, the same was true of ZKP just a few years ago. Recent examples include the teams applying MPC and FHE to blockchain-based programmable privacy, which have made significant strides in the past year. In general, when capable teams receive sufficient funding, significant progress can be made in what may seem like a short amount of time.
It is understood that currently only a few teams like Sora and Gauss Labs are working on some implementations. Given the potential of IO, it is expected that at least three startups will raise venture capital to accelerate development and make it more practical.
15. Adoption of Encrypted Mempools Will Remain Low (<10% of Total Transaction Volume)
Encrypted mempools are a method of reducing harmful MEV (such as front-running and sandwich attacks) by keeping transactions secret (submit-reveal) until after ordering. In practice, there are many different approaches that primarily trade off along two dimensions:
1. Product Integration:
· External protocols (e.g., Shutter)
· Integrated as part of a broader product (e.g., the shared sequencing protocol Radius)
2. Decryption Trust Guarantee:
· Trusted third parties
· TEE
· Threshold decryption
· Time delays
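The submit-reveal flow common to all these variants can be sketched in a few lines. The XOR "encryption" below is a toy stand-in so the example stays self-contained; real designs use threshold decryption, TEEs, or time-lock schemes for the reveal step.

```python
# Minimal sketch of an encrypted-mempool submit-reveal flow: users submit
# ciphertexts, the sequencer fixes an order while blind to contents, and
# transactions are only decrypted afterwards. The hash-based XOR cipher
# is a toy stand-in for illustration only.

import hashlib

def keystream(key: bytes, length: int) -> bytes:
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    # XOR is its own inverse, so the same call encrypts and decrypts.
    return bytes(a ^ b for a, b in zip(keystream(key, len(data)), data))

# 1. Users encrypt transactions before submission.
key = b"committee-key"  # in practice: threshold-shared, never held by one party
mempool = [xor_cipher(key, tx) for tx in (b"swap A->B", b"buy token X")]

# 2. The sequencer commits to an order without seeing any plaintext.
order = sorted(range(len(mempool)),
               key=lambda i: hashlib.sha256(mempool[i]).digest())

# 3. Only after the order is fixed are transactions decrypted and executed,
#    so front-running the contents is no longer possible.
revealed = [xor_cipher(key, mempool[i]) for i in order]
print(revealed)
```

The trust-guarantee dimension listed above is exactly the question of who holds `key` and when it can be used.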
While the overall benefits of encrypted mempools seem positive, adoption may be challenging for external protocols. In projects offering encrypted mempools as part of a broader product, adoption depends on the success of the larger product. The clearest adoption path is integrating the solution into the core protocol itself, but this may take more than a year to implement, especially for Ethereum, although it is on the roadmap.
Consensus and P2P Network
16. At least one existing network will transition from PoW or BFT-based PoS to DAG-based consensus
DAG-based consensus mechanisms allow for the separation of communication (data propagation) and the consensus layer (linear ordering of transactions) in a way that is more suited to distributed systems. The data structure provides deterministic ordering, such that as long as each node (eventually) has the same DAG, all nodes will eventually arrive at the same ordering.
One key advantage of this approach is the reduction of communication overhead. Leaders do not need to construct and disseminate official blocks but only need to authenticate a determined sub-DAG. Upon receiving this authentication, other nodes can deterministically construct equivalent blocks locally. Apart from early pioneers like Aptos and Sui, newer protocols (such as Aleo) have also implemented DAG-based consensus. This trend is expected to continue, with at least one major protocol deciding to transition from PoW or BFT-based PoS to DAG-based consensus.
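The core property described above can be illustrated with a toy linearization rule: any node holding the same DAG derives the same order. This is a deliberate simplification of protocols like Narwhal-Bullshark, not their actual algorithm.

```python
# Toy illustration of the DAG-consensus property: given the same DAG,
# every node derives the same linear order via a deterministic
# topological sort (ties broken by vertex id). A simplification of
# protocols like Narwhal-Bullshark, not their actual algorithm.

def deterministic_order(dag: dict[str, set[str]]) -> list[str]:
    """dag maps each vertex to the set of parent vertices it references."""
    order: list[str] = []
    done: set[str] = set()
    while len(done) < len(dag):
        # Among vertices whose parents are all ordered, take the smallest id.
        ready = sorted(v for v in dag if v not in done and dag[v] <= done)
        order.append(ready[0])
        done.add(ready[0])
    return order

dag = {"a": set(), "b": {"a"}, "c": {"a"}, "d": {"b", "c"}}
print(deterministic_order(dag))  # identical on every node holding this DAG
```

Because ordering is a pure function of the DAG, the leader never needs to broadcast a block body, only a certificate anchoring a sub-DAG.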
Due to the complexity of implementation (even with existing implementations like Narwhal-Bullshark or Mysticeti), the likelihood of completing a full transition by the end of 2025 is low. However, this prediction could be overturned if a team is able to execute swiftly.
17. The QUIC transport layer will open up to secure components beyond TLS (reducing ties to HTTP)
QUIC (Quick UDP Internet Connections) is a modern transport layer protocol developed by Google and later standardized by the Internet Engineering Task Force (IETF). It is designed to reduce latency, improve connection reliability, and enhance security.
QUIC uses UDP (User Datagram Protocol) as its foundation instead of the TCP underlying HTTP/1.1 and HTTP/2. However, TCP benefits from decades of optimization (including protocol-level improvements and offloading work to the kernel) that still give it performance advantages.
While there are existing proposals to incorporate QUIC into the kernel, QUIC implementations not reliant on TLS will make hardware acceleration easier. This will alleviate some performance issues and may drive more usage of QUIC in P2P networks. Currently, only Solana, Internet Computer, and Sui are known to use QUIC in major blockchains.
User Experience
18. At least one Solana application will operate in a Rollup/network extension manner, but with a user experience akin to Layer 1
While the Solana core team is focused on improving L1, the industry has already observed Solana's trend towards modularity. A key distinction is that Solana Network Extensions (L2s) pay less attention to pure scalability and more to providing developers (and users) with new experiences that the current L1 cannot achieve. This includes lower latency and customized/sovereign block space, mainly applicable to use cases that perform well in a segregated environment and are less reliant on accessing shared state (such as gaming or certain DeFi applications).
Given the more user- and product-centric nature of the broader Solana ecosystem, this trait is expected to carry over to these network extensions as well. It is anticipated that at least one Solana application will launch as a rollup/network extension without users noticing that they have moved away from Solana L1. Potential candidates include applications built on Magic Block or Bullet (ZetaX).
A great example from the Ethereum ecosystem is Payy - a mobile app offering private USDC payments. It has a simple onboarding process and a seamless user experience, but in the background, it runs as an Ethereum validium built on the Polygon tech stack.
*Disclaimer: Equilibrium Ventures is an investor in Magic Block and Zeta.
19. Over 25% of On-chain Transactions Will Be Generated in a Chain-Agnostic Manner
Chain agnosticism is a term that encompasses various methods used to abstract the complexity of navigating the blockchain, particularly in a multi-chain world. While early adopters (power users) are willing to endure more hassle, chain agnosticism can provide a reasonable trade-off for less experienced users. Another way to look at it is risk transfer, where external parties (like an intent resolver) are trusted to manage and handle multi-chain complexities on behalf of users.
By the end of 2025, it is expected that at least 25% of all on-chain transactions will be generated in a chain-agnostic way, meaning end-users do not need to know which underlying chain they are using.
While chain agnosticism does increase the trust assumption and blur the risks, there may emerge institutions similar to "on-chain rating agencies" (e.g., L2Beat) that rate different solutions. This will allow users to set preferences, such as only interacting with chains above a specific security threshold (like rollups with fraud proofs). Another risk factor is related to the resolver market, which should be competitive enough to ensure users get good outcomes and minimize censorship risks.
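The "rating agency" idea can be sketched as a simple preference filter: the resolver only routes through chains whose rated category clears the user's threshold. Chain names, categories, and ratings below are illustrative, loosely modelled on L2Beat-style classifications.

```python
# Sketch of the "on-chain rating agency" idea: a user sets a minimum
# security threshold and a chain-agnostic resolver only routes through
# chains that clear it. All names and ratings are hypothetical.

RATINGS = {  # higher = stronger security guarantees
    "rollup-with-proofs": 3,
    "optimistic-stage1": 2,
    "validium": 1,
    "sidechain": 0,
}

def eligible_chains(chains: dict[str, str], min_rating: int) -> list[str]:
    """Chains whose rated category meets the user's security threshold."""
    return sorted(c for c, cat in chains.items() if RATINGS[cat] >= min_rating)

chains = {"chainA": "rollup-with-proofs", "chainB": "sidechain",
          "chainC": "optimistic-stage1"}
print(eligible_chains(chains, min_rating=2))  # ['chainA', 'chainC']
```

Advanced users could set `min_rating` high or bypass the resolver entirely, matching the two-tier usage pattern described below.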
Ultimately, advanced users can still choose to operate in the traditional way, while those less versed in the various options can outsource decision-making to more professional third parties.
20. Most new Rollups will be launched on a ZK technology stack with native interoperability
Validity-rollup clusters built around a shared L1 bridge design provide stronger (asynchronous) interoperability guarantees than their counterparts. With each additional rollup introduced, the network effect of the cluster grows.
Most new rollups expected to launch in 2025 will be built on a ZK technology stack with native interoperability. While the cluster is composed of multiple different chains, the goal is to make users feel like they are using a single chain. This enables developers to focus more on applications, user experience, and onboarding processes.
Summary
Infrastructure and Scalability
We are already seeing the first wave of applications expanding their user base, but there is still much work to be done to ensure that the underlying infrastructure can accommodate more users and broader applications.
Despite significant progress made by the industry during the past bear market, new scalability bottlenecks and the need for infrastructure funding will continue to emerge. This is a dynamic observed across multiple cycles, and there is no reason to believe this time will be different. In other words, there is no such thing as "enough scalability." With each capacity increase, new use cases become viable, driving up demand for block space.
Privacy Concerns
Privacy may be the final major issue that needs to be addressed in blockchain. Currently, the understanding of the future roadmap is relatively clear, the key is to bring all parts together and improve performance. The recent favorable ruling in the Tornado Cash case has raised expectations for a more open government stance, but there is still a lot of work to be done on both a technical and societal level.
User Experience
Over the past few years, the industry has done quite well in abstracting complexity in single-chain usage. However, with the launch of more and more new chains and L2/L3 solutions, optimizing cross-chain user experience is becoming increasingly critical.
ZK Proof Technology
Several predictions for next year are built on the foundation of ZK proof becoming cheaper and faster to enable more use cases. This trend is expected to continue in 2025, primarily driven by the following factors:
· Software optimization
· More specialized hardware
· Decentralized prover networks, which can seek the cheapest computing resources globally and let users avoid paying for idle hardware
Overall, the development prospects for 2025 are exciting, and the industry will continue to move forward.
Debunking the AI Doomsday Myth: Why Establishment Inertia and the Software Wasteland Will Save Us
Editor's Note: Citrini7's cyberpunk-themed AI doomsday prophecy has sparked widespread discussion across the internet. However, this article presents a more pragmatic counter perspective. If Citrini envisions a digital tsunami instantly engulfing civilization, this author sees the resilient resistance of the human bureaucratic system, the profoundly flawed existing software ecosystem, and the long-overlooked cornerstone of heavy industry. This is a frontal clash between Silicon Valley fantasy and the iron law of reality, reminding us that the singularity may come, but it will never happen overnight.
The following is the original content:
Renowned market commentator Citrini7 recently published a captivating and widely circulated AI doomsday novel. While he acknowledges that the probability of some scenes occurring is extremely low, as someone who has witnessed multiple economic collapse prophecies, I want to challenge his views and present a more deterministic and optimistic future.
In 2007, people thought that against the backdrop of "peak oil," the United States' geopolitical status had come to an end; in 2008, they believed the dollar system was on the brink of collapse; in 2014, everyone thought AMD and NVIDIA were done for. Then ChatGPT emerged, and people thought Google was toast... Yet every time, existing institutions with deep-rooted inertia have proven to be far more resilient than onlookers imagined.
When Citrini talks about the fear of institutional turnover and rapid workforce displacement, he writes, "Even in fields we think rely on interpersonal relationships, cracks are showing. Take the real estate industry, where buyers have tolerated 5%-6% commissions for decades due to the information asymmetry between brokers and consumers..."
Seeing this, I couldn't help but chuckle. People have been proclaiming the "death of real estate agents" for 20 years now! It hardly requires superintelligence; Zillow, Redfin, or Opendoor would suffice. But this example proves the opposite of Citrini's point: although this workforce has long been deemed obsolete by most, market inertia and regulatory capture have made real estate agents far more tenacious than anyone expected a decade ago.
A few months ago, I just bought a house. The transaction process mandated that we hire a real estate agent, with lofty justifications. My buyer's agent made about $50,000 in this transaction, while his actual work — filling out forms and coordinating between multiple parties — amounted to no more than 10 hours, something I could have easily handled myself. The market will eventually move towards efficiency, providing fair pricing for labor, but this will be a long process.
I deeply understand the ways of inertia and change management: I once founded and sold a company whose core business was driving insurance brokerages from "manual service" to "software-driven." The iron rule I learned is: human societies in the real world are extremely complex, and things always take longer than you imagine — even when you account for this rule. This doesn't mean that the world won't undergo drastic changes, but rather that change will be more gradual, allowing us time to respond and adapt.
Recently, the software sector has seen a downturn as investors worry that the backend systems of companies like Monday.com, Salesforce, and Asana lack moats and are easily replicable. Citrini and others believe that AI programming heralds the end of SaaS companies: first, products become commoditized and profits go to zero; second, the jobs disappear.
But everyone overlooks one thing: the current state of these software products is simply terrible.
I'm qualified to say this because I've spent hundreds of thousands of dollars on Salesforce and Monday. Indeed, AI can enable competitors to replicate these products, but more importantly, AI can enable competitors to build better products. Stock price declines are not surprising: an industry relying on long-term lock-ins, lacking competitiveness, and filled with low-quality legacy incumbents is finally facing competition again.
From a broader perspective, almost all existing software is garbage, which is an undeniable fact. Every tool I've paid for is riddled with bugs; some software is so bad that I can't even pay for it (I've been unable to use Citibank's online transfer for the past three years); most web apps can't even get mobile and desktop responsiveness right; not a single product can fully deliver what you want. Silicon Valley darlings like Stripe and Linear only garner massive followings because they are not as disgustingly unusable as their competitors. If you ask a seasoned engineer, "Show me a truly perfect piece of software," all you'll get is prolonged silence and blank stares.
Here lies a profound truth: even as we approach a "software singularity," the human demand for software labor is nearly infinite. It's well known that the final few percentage points of perfection often require the most work. By this standard, almost every software product has at least a 100x improvement in complexity and features before reaching demand saturation.
I believe most commentators claiming the software industry is on the brink of extinction lack an intuitive feel for software development. The industry is 50 years old, and despite tremendous progress it has always been in a state of "not enough." As a programmer in 2020, my output matches that of hundreds of people in 1970, which is astonishing leverage, yet there is still enormous room for improvement. People underestimate the Jevons paradox: efficiency gains often trigger explosive growth in total demand.
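The Jevons paradox argument can be made concrete with a toy constant-elasticity demand model. This is my illustration, not the author's, and the parameter values are arbitrary: the point is only that when demand is elastic enough (elasticity greater than 1), making software cheaper to produce increases total spend on software labor rather than shrinking it.

```python
# Toy constant-elasticity demand model illustrating the Jevons paradox.
# Assumption (illustrative, not from the source): demand follows Q = k * p^(-e),
# so total spend is S = p * Q. With elasticity e > 1, a price DROP raises S.

def total_spend(price: float, k: float = 100.0, elasticity: float = 1.5) -> float:
    """Total spend on software at a given unit price, under Q = k * p^(-e)."""
    quantity = k * price ** (-elasticity)
    return price * quantity

# Suppose AI makes building software 10x cheaper: price falls from 1.0 to 0.1.
before = total_spend(1.0)  # spend at the old price
after = total_spend(0.1)   # spend at the new, 10x lower price

print(f"spend before: {before:.1f}")  # 100.0
print(f"spend after:  {after:.1f}")   # ~316.2: demand grew faster than price fell
```

With elasticity below 1 the same model gives the opposite result, which is why the paradox is an empirical claim about software demand, not a mathematical certainty.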
This does not mean software engineering is an unassailable job, but the industry's capacity to absorb labor, and its inertia, far exceed what people imagine. Saturation will come very slowly, leaving us enough time to adapt.
Of course, labor reallocation is inevitable, such as in the driving sector. As Citrini pointed out, many white-collar jobs will experience disruptions. For positions like real estate brokers that have long lost tangible value and rely solely on momentum for income, AI may be the final straw.
But our saving grace is that the United States has nearly infinite potential demand for reindustrialization. You may have heard of "reshoring," but it goes far beyond that. We have essentially lost the ability to manufacture the core building blocks of modern life: batteries, motors, small-scale semiconductors; the entire electricity supply chain depends almost entirely on overseas sources. What if there is a military conflict? Worse still, did you know that China produces 90% of the world's synthetic ammonia? If that supply were cut off, we couldn't even make fertilizer and would face famine.
As long as you look to the physical world, you will find endless job opportunities that will benefit the country, create employment, and build essential infrastructure, all of which can receive bipartisan political support.
We can already see the economic and political winds shifting in this direction: talk of reshoring, deep tech, and "American dynamism." My prediction is that when AI hits the white-collar sector, the path of least political resistance will be funding large-scale reindustrialization, absorbing labor through a giant employment program. Fortunately, the physical world has no "singularity"; it is constrained by friction.
We will rebuild bridges and roads. People will find that seeing tangible results from their labor is more fulfilling than churning through digital abstractions. The Salesforce senior product manager who lost a $180,000 salary may find new work at the "California Seawater Desalination Plant," helping end a 25-year drought. These facilities must not only be built but built excellently and maintained for the long term. If we are willing, the Jevons paradox applies to the physical world too.
The goal of large-scale industrial engineering is abundance. The United States will once again achieve self-sufficiency, enabling large-scale, low-cost production. Moving beyond material scarcity is crucial: in the long run, if we do indeed lose a significant portion of white-collar jobs to AI, we must be able to maintain a high quality of life for the public. And as AI drives profit margins to zero, consumer goods will become extremely affordable, automatically fulfilling this objective.
My view is that different sectors of the economy will "take off" at different speeds, and the transformation in almost all areas will be slower than Citrini anticipates. To be clear, I am extremely bullish on AI and foresee a day when my own labor will be obsolete. But this will take time, and time gives us the opportunity to devise sound strategies.
At this point, preventing the kind of market collapse Citrini imagines is actually not hard. The U.S. government's pandemic response showed how proactive and decisive it can be in a crisis; if necessary, massive stimulus will intervene quickly. Its inefficiency annoys me somewhat, but that is beside the point. The point is to safeguard material prosperity in people's lives: the universal well-being that gives a nation legitimacy and upholds the social contract, rather than clinging to past accounting metrics or economic dogma.
If we can maintain sharpness and responsiveness in this slow but sure technological transformation, we will eventually emerge unscathed.