How NVIDIA Built the Payments Foundation Model for PayPal
In Issue 5 of "Agentic Commerce," Simon Taylor (Head of Market Development at Tempo) and Bam Azizi (CEO and Founder of Mesh) invited Pahal Patangia (Head of Global Industry Business Development and Payments at NVIDIA) to discuss topics such as open-source models in financial services, agentified workflows as intellectual property in business, and more.
Timeline:
00:00 Introduction
05:03 Payment Foundation Model based on Transformer architecture
10:44 Adoption of open-source models in financial services
17:53 Cost and latency trade-offs in AI inference
20:24 Token economics and efficiency in AI systems
23:21 Agentified workflows as intellectual property in business
25:45 Trends in protocol integration in Agentic Commerce
30:17 Open-source runtime OpenSHIELD for agent security
33:33 Advantages of stablecoins in agent-to-agent micropayments
35:36 Search is being adopted in agents faster than payments
Takeaways:
- The essence of Agentic Commerce is "context outsourcing": the context of consumer decision-making previously held by humans is now being transferred to agents through embedding + foundation models, making payment capabilities part of the decision chain rather than just the execution layer.
- The Payment Foundation Model is a core variable: inputting traditional tabular financial data into Transformers generates user behavior embeddings, which is the key infrastructure for agents to "consume like humans."
- Search has matured, while payments are still in early stages: the real implementation of Agentic Commerce is currently focused on search and recommendation, while payments remain in sandbox and experimental phases.
- The fundamental reason for the explosion of open source in the financial industry is not technology, but regulation and control: explainability, controllability, and fine-tuning capabilities are more important than performance.
- The performance gap between open-source and closed-source models has narrowed to a "negligible range," making cost, compliance, and deployment flexibility the dominant factors in corporate decision-making.
- Token economics is becoming the new generation of "payment economics": the core constraints of AI systems are no longer just transaction fees, but the comprehensive optimization of token consumption, inference costs, latency, and energy consumption.
- Multi-agent systems are the future battleground: issuers, acquirers, merchants, and internal enterprise systems will evolve into agents, completing business processes through machine-to-machine interactions.
- Agent workflows are becoming new core assets for enterprises: previously it was APIs and SaaS, now the decision paths, execution logic, and feedback loops of agents constitute new "business IP."
- Stablecoins have structural advantages in agent-to-agent scenarios: micropayments, real-time settlement, and global availability are things traditional card networks cannot support.
- The transaction volume growth brought by agents is exponential: humans conduct about 2 transactions a day, while agents may conduct 2000, and traditional payment system TPS models cannot accommodate this paradigm shift.
- Payment rails will not be replaced but will coexist in layers: card networks are suitable for human interactions, while stablecoins are more suitable for machine interactions, and both will run in parallel in different scenarios.
- The protocol layer is currently in the "early stage of LLM": the coexistence of multiple protocols promotes innovation, and in the long term, it will inevitably converge to a few standards.
- Security has become an infrastructure issue in the agent era: runtimes like OpenSHIELD are needed to isolate agents in sandboxes to prevent systemic risk spread.
- The core use cases of AI in payments have not changed: anti-fraud, identity verification, and personalization remain the most core values, with the implementation evolving from rules to models to agents.
- The real breakthrough of Agentic Commerce is not in payments but in "decision automation": when search + recommendation + execution are fully automated, payments are just the final step of capability invocation.
Simon Taylor:
Welcome to Tokenized, a show focused on stablecoins and the adoption of real-world asset tokenization. I am Simon Taylor, your host today, and also the author of Fintech Brain Food, as well as the Head of Market Development at Tempo.
Today we continue the Agentic Commerce series, and joining me is Bam Azizi, CEO of Mesh. How have you been, Bam?
Bam Azizi:
I’m doing well, thank you Simon for having us again.
Simon Taylor:
This series is really taking off. I feel that Agentic Commerce has become one of the hottest topics in the world right now, truly capturing everyone's attention.
Today we also have a guest from a company that is also very much in the spotlight—arguably one of the largest companies in the world—but they have done some things to support Agentic Commerce that most people are not aware of.
So today we welcome Pahal Patangia, Head of Global Industry Business Development and Payments at NVIDIA. Pahal, how have you been?
Pahal Patangia:
I’m doing well, Simon, thank you for the invitation. I’m excited to be on the show and look forward to our conversation.
Simon Taylor:
Indeed, everything is coming together. This is everything I love: payments, NVIDIA's gaming heritage, commerce, stablecoins... all the good stuff.
But before we start, I want to remind our viewers and listeners: the views of our guests represent their personal opinions and do not necessarily reflect those of their companies. Also, anything we discuss does not constitute tax, legal, or financial advice, so please do your own research.
Alright, from a macro perspective, what does Agentic Commerce mean for a company like NVIDIA? A GPU company, an accelerated computing company, an AI company, a hardware company—why are you involved in payments and business?
Pahal Patangia:
Of course, Simon, that’s a great question. I’m glad you’re asking from the perspective of a GPU company, hardware company, and accelerated computing company, as that has indeed been the perception of NVIDIA for decades.
But I want to say that this perception has been evolving over the past 20 years.
Over the past few decades, NVIDIA has transformed into a full-stack accelerated computing platform, providing capabilities for AI applications across the entire ecosystem.
Before we dive into Agentic Commerce or AI, it’s important to understand NVIDIA's positioning at the platform level and the capabilities we provide—these capabilities are actually driving the AI explosion you see every day.
We typically describe NVIDIA's capabilities for building AI applications in the ecosystem using a "five-layer cake" concept.
This "five-layer cake" consists of different "ingredients" that make it possible to build AI applications and AI factories in a scalable way today.
The bottom layer is land, power, and energy—this is the foundation for doing anything AI-related.
Above that is the chip layer, which includes hardware, GPUs, CPUs, and related networking systems.
Next is the system layer, or data center layer, which organizes how these chips are put together; we view them as different units that ultimately combine into a "massive computer."
In the past, we understood computers as personal devices, but now the data center itself is a computer, which is the system layer.
Above that is the foundation model layer. These foundation models contain knowledge, industry understanding, and various capabilities. There are many partners in the ecosystem, such as OpenAI, Meta, Mistral, etc., building these foundation models.
But these foundation models need to be further refined to specific industries, specific scenarios, and specific problems, which is the fifth layer—the application layer.
NVIDIA's platform spans these five layers, combining this entire set of capabilities. Developers can leverage this five-layer platform to build applications for their use cases.
In the payments space, a key application is Agentic Commerce.
Our goal is to embed our hardware, software, and model capabilities into these ecosystem players, enabling them to build these applications at scale. That’s our positioning and how we are driving the development of the entire ecosystem.
Simon Taylor:
One interesting point for me: when we talk to people about Agentic Commerce, everyone assumes there is a lot of software and hardware running these things in the background. You've been in this industry a long time and really understand how these underpinnings work. What's your take?
Bam Azizi:
Yes, it’s interesting. I actually posted something on LinkedIn about this layered structure that got quite a bit of attention.
It’s very similar to what Pahal just described. I talked about the foundational layer, distribution layer, orchestration layer, and connection layer. My point was that the connection layer is the most important—of course, a bit "self-serving," since Mesh operates at that layer.
But I’m really curious, from NVIDIA's perspective, which layer do you think is the most important? Where are you currently investing the most time and resources?
Pahal Patangia:
Yes, that’s a great question. I think from our perspective, there are two very key phenomena happening in the payments industry right now.
We are bringing AI into the payments industry at scale, and typically one phenomenon leads to another.
The first phenomenon is the emergence of the "Payment Foundation Model."
If you look at the entire process of Agentic Commerce, you’ll find that this process has actually been "compressed." For example, the checkout process has been compressed.
In the past world, you as a person held the context. You knew what you wanted to buy, you knew how to complete the checkout, and that context existed in your mind.
But now the question is: where does the agent get this context?
The agent must learn user behavior, user profiles, user preferences, and the various constraints you set for the transaction (from SKU to the final transaction rules) to acquire this context.
So how does the agent gain these capabilities?
This leads to a new trend, which I would say is a bit "underground," but is rapidly gaining attention—the "Payment Foundation Model."
Because in the financial services industry, especially in payments and banking, historically all data has existed in structured tabular forms.
In the past, you would feed this data to machine learning algorithms to build propensity models, such as predicting what users might buy or what transactions they might make.
However, with the emergence of a new generation of algorithms, particularly the Transformer architecture—it’s the foundation of generative AI—there is now a new trend of exposing this structured data to Transformer models.
This is the concept of the "Payment Foundation Model."
These models generate something called "embedding."
In simple terms, embedding is a semantic representation of user behavior. For example:
What Pahal is likely to do
What his recent dynamic preferences are
What his long-term behavior patterns are
Transformer models can integrate this information to form this embedding.
Then these embeddings are input into the agent, which executes actions based on this information, such as completing transactions.
This is where the two worlds begin to merge—AI and payments.
These embeddings become the "context layer" for the agent, allowing the agent to execute better, iterate better, and ensure all actions stay within the set rules while continuously learning and optimizing.
This is an important trend currently driving the development of Agentic Commerce.
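The tabular-data-to-embedding idea Pahal describes can be sketched in miniature. The toy below is not NVIDIA's or PayPal's actual model: the weights are random and the feature columns are invented. It just shows the shape of the pipeline, a sequence of tabular transaction rows run through one self-attention layer and mean-pooled into a single behavior embedding:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def behavior_embedding(transactions, d_model=16, seed=0):
    """Toy 'payment foundation model': contextualize a sequence of tabular
    transaction rows with one self-attention layer, then mean-pool them
    into a single fixed-size user-behavior embedding."""
    rng = np.random.default_rng(seed)
    n_feats = transactions.shape[1]
    # Project tabular features (amount, merchant category, hour, ...) into model space
    W_in = rng.normal(size=(n_feats, d_model))
    x = transactions @ W_in                      # (seq_len, d_model)
    # Single-head self-attention: every transaction attends to every other
    W_q, W_k, W_v = (rng.normal(size=(d_model, d_model)) for _ in range(3))
    q, k, v = x @ W_q, x @ W_k, x @ W_v
    attn = softmax(q @ k.T / np.sqrt(d_model))   # (seq_len, seq_len)
    ctx = attn @ v
    # Mean-pool the contextualized rows into one embedding vector
    return ctx.mean(axis=0)                      # (d_model,)

# Five recent transactions: [amount, merchant_category, hour_of_day] (invented)
history = np.array([
    [12.5, 3, 9], [80.0, 1, 20], [4.2, 3, 9], [15.0, 2, 13], [99.9, 1, 21],
], dtype=float)
emb = behavior_embedding(history)
print(emb.shape)  # (16,)
```

A production model would learn these projections from billions of transactions; the point here is only the interface: tabular rows in, one fixed-size behavior vector out, which the agent then consumes as context.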
Additionally, I want to emphasize another trend we see in Agentic Commerce:
If you break the entire process into "search" and "payment,"
the part that is truly developing the fastest and is most mature is the "search" component.
The search problem has been studied for many years, and now there are better algorithms to solve it, so this wave of technology is very effective in "search."
This is also why the user experience is becoming more personalized and stickier.
We are also collaborating a lot with PayPal. PayPal wants to bring the capabilities of Agentic Commerce to their merchant ecosystem, which consists of about 19 million merchants.
Most of these merchants are small to medium-sized businesses, and they are relatively "in the dark" about AI, not fully understanding what’s happening.
PayPal’s approach is to provide these capabilities to these merchants through their platform.
Their method is:
Fine-tuning open-source models to adapt these models to PayPal's environment and specific use cases.
This way, merchants can naturally use these capabilities without needing to understand the underlying technology themselves.
Simon Taylor:
I just heard a lot from you, and I want to try to summarize it to see if I understood correctly, while also making it easier for the audience to grasp.
Many people overlook one point: besides closed models like Anthropic's Claude, OpenAI's ChatGPT, and Google's Gemini, there are actually many open-source models, and NVIDIA is an important player in this area.
Your NeMo toolchain and Nemotron models have consistently been at the forefront in performance.
Then companies like PayPal will bring these capabilities to merchants.
Creating value for merchants is everything in the payments industry. Merchants are the core of the world’s operations. If you can’t serve merchants, then you’re essentially nothing.
They are the ones selling goods, they are your customers, they pay you. So you must create value for them.
Stripe previously released a Payment Foundation Model that performed well in anti-fraud.
But I’m curious, besides anti-fraud, what else can the Payment Foundation Model do?
If I now have a very rich, multi-dimensional embedding that can understand various user preferences, how can these capabilities help merchants sell more and serve customers better?
And merchants are likely unwilling to share this data with large AI labs.
So they would tend to use open-source models.
Moreover, the performance gap between open-source and frontier models is now only about six months.
For most everyday uses, the difference is almost imperceptible.
For many small to medium-sized merchants, these models are already far superior to the free version of ChatGPT they are currently using.
So PayPal can provide them with a very good experience, while the underlying capabilities are actually from NVIDIA.
I think many people are not aware of this.
Additionally, I saw a survey indicating that 65% of financial institutions are already using AI, while 84% say open-source models are important to their AI strategy.
So I want to ask you: why have open-source models become so important in the financial industry?
Pahal Patangia:
Yes, that’s a great question.
The financial industry has always been "slow to adopt" new technologies.
The reasons for this "slow adoption" include:
Regulation
Requirements for explainability
And distrust of "black box models"
Financial institutions want to understand what’s happening inside the model so they can confidently use it in production environments.
So they tend to prefer models that can be controlled and fine-tuned.
At the same time, as you mentioned, the performance of open-source models is now very close to that of large closed-source models.
This "performance proximity" shifts the focus of discussion from "model performance" to other dimensions, such as:
Cost
Control
Compliance
System resilience
Enterprises want more options when building these applications rather than relying on a single vendor.
Of course, we also view foundation model providers as important clients and partners.
But at the same time, when enterprises need more flexibility, open-source models become more suitable.
For example, NVIDIA's Nemotron models and NeMo toolchain can help enterprises fine-tune models more easily.
And this capability will become increasingly important in Agentic Commerce.
Simon Taylor:
This trade-off is indeed interesting.
Bam, I also want to ask you, from the perspective of building a company in the stablecoin and payments space, how do you view open-source versus closed-source? Do your clients care about this issue?
Bam Azizi:
I think from the customer's perspective, they actually don’t care whether it’s open-source or closed-source.
That’s a concern for the tech community, which is important for scientific and technological development.
But customers care about one thing:
whether it is the best solution for running their business.
However, open source is very important for the industry, and we still need to promote it as much as possible.
Another point that impressed me was what Pahal mentioned about NVIDIA's position.
In the past, NVIDIA was more like the hardware layer, then there would be a layer in between, like ChatGPT, cloud vendors, etc., and then the application layer.
But now you are collaborating directly with companies like PayPal, does that mean you are "skipping the middle layer"?
Does it mean faster, cheaper, and more efficient?
Would that pose a threat to companies like OpenAI?
Pahal Patangia:
Not at all.
Our philosophy is to "support developers where they are."
If developers want to use our large partners, such as foundation model providers, we fully support that and help them achieve the best results.
If they want to use open-source models, we also provide tools and platform support.
It really depends on the internal business needs and decisions of the enterprise.
We provide a complete platform that allows them to choose freely.
Simon Taylor:
I find this trade-off very interesting.
Pahal, how do you guide payment companies like PayPal in making these decisions? For example, when they want to provide these capabilities to merchants, how do you help them weigh different use cases? What feedback do you hear from these payment companies?
Pahal Patangia:
That’s a great question.
In this field, as you start running increasingly complex models, from today’s models to future agents, to multi-agent systems, there are many factors to consider.
First, of course, is accuracy. But once you optimize accuracy to a certain extent, what really determines the outcome are several other factors.
The first is cost.
For example, if you’re serving 19 million merchants, that generates a massive number of inference calls every day. You have to think about how to optimize the cost of these inference calls to the lowest in your use case.
The second is latency.
No one wants to wait, just like being stuck with the little dinosaur game Chrome shows when the network goes down.
You need millisecond-level responses.
The model needs to think, infer, gather information from different data sources, combine context, and make decisions within established rules—all within milliseconds.
To accomplish all this requires consuming a lot of tokens, making many decisions, executing complex processes, and all of it must be dynamic and intelligent.
If the agent is fine-tuned correctly and operates under the right constraints, it can achieve this.
You execute once, and then there’s a feedback loop.
This feedback loop creates a "data flywheel":
You continuously gain new data, compare "actual results" with "ideal results," and then continuously optimize the model.
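That feedback loop can be caricatured with a one-parameter "model". In this purely illustrative sketch (real flywheels retrain or fine-tune the underlying model), an agent's fraud-score approval threshold is nudged whenever the actual outcome differs from the ideal one:

```python
# Toy "data flywheel": compare what actually happened with what the agent
# decided, and nudge the model. Here the whole "model" is one fraud-score
# threshold; real systems update model weights instead. Illustrative only.

def run_flywheel(events, threshold=0.5, lr=0.05):
    """events: (fraud_score, was_actually_fraud) pairs, observed after the fact."""
    for score, was_fraud in events:
        approved = score < threshold
        if approved and was_fraud:            # actual != ideal: fraud slipped through
            threshold -= lr                   # tighten
        elif not approved and not was_fraud:  # actual != ideal: blocked a good customer
            threshold += lr                   # loosen
        threshold = min(max(threshold, 0.0), 1.0)  # keep the model in bounds
    return threshold

events = [(0.4, True), (0.6, False), (0.3, False), (0.45, True), (0.2, False)]
final = run_flywheel(events)
print(round(final, 2))  # 0.45
```

Swap the threshold for model weights and the nudges for fine-tuning updates, and you have the flywheel described here: new data in, actual versus ideal compared, model improved.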
Pahal Patangia:
And then when you extend this logic from a single agent to a multi-agent system, things become even more complex.
For example:
Agents on the network side
Agents on the issuer side
Agents on the acquirer side
These agents will communicate with each other.
Or within the enterprise:
A procurement agent in the SAP system
It needs to talk to the inventory system
It also needs to talk to the finance system
How does the entire system perform inference? How does it become more efficient?
This leads to a problem: token consumption will explode.
That’s why "token economics" becomes very important.
It’s not just about reducing token usage, but how to achieve optimal efficiency between cost, computing power, and latency.
It can even be understood as:
"How many high-quality token outputs can be generated per kilowatt-hour."
There is actually an economic model behind this.
If you don’t manage it well, it’s easy to burn a lot of money.
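That burn-rate warning is easy to quantify with a back-of-envelope token-economics calculation. The 19 million merchant figure is from earlier in the episode; every other number below is an assumption for illustration:

```python
# Back-of-envelope token economics for a merchant-serving inference fleet.
# The 19M merchant base is from the episode; everything else is assumed.

def daily_inference_cost(merchants, calls_per_merchant, tokens_per_call,
                         usd_per_million_tokens):
    tokens_per_day = merchants * calls_per_merchant * tokens_per_call
    usd_per_day = tokens_per_day / 1e6 * usd_per_million_tokens
    return tokens_per_day, usd_per_day

tokens, cost = daily_inference_cost(
    merchants=19_000_000,         # PayPal-scale merchant ecosystem
    calls_per_merchant=10,        # assumed inference calls per merchant per day
    tokens_per_call=2_000,        # assumed prompt + completion size
    usd_per_million_tokens=0.50,  # assumed blended inference price
)
print(f"{tokens:.2e} tokens/day -> ${cost:,.0f}/day")
```

Even small per-call savings compound: halving `tokens_per_call` halves the daily bill, which is why "useful tokens per dollar (or per kilowatt-hour)" becomes the operative metric.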
Anyone who has played with OpenClaw knows it's easy to spend $1,000 in a month just by calling a few APIs and falling into various rabbit holes.
For enterprises, this issue is even more serious.
In the past, you might have just run some machine learning models, like models on Snowflake, CNNs, etc., but now the cost structure of these AI models is completely different.
For a company focused on customer loyalty or anti-fraud, this cost difference is enormous.
And among different roles like card organizations, merchants, and issuers, each role has different requirements for agents and different token needs.
So the complexity of the entire system is very high.
Not only do you need to control costs, but you also need the system to continuously improve over time, learning like a human:
"You just made a mistake, don’t do that again next time."
But if you’ve really used OpenClaw, you’ll know that keeping the system consistently stable in doing the right things is actually very difficult.
So solving this problem in enterprise scenarios is very valuable for NVIDIA.
Simon Taylor:
Let’s bring the topic back to e-commerce.
What impact is Agentic Commerce currently having on business?
Can users really feel these changes at checkout? Where is this value manifested?
Pahal Patangia:
Our goal is to support those players who truly create value for end users, such as payment platforms like PayPal.
At the same time, they will collaborate with large retailers to deploy consumer-facing agents on top of them.
From an industry perspective, some trends we are seeing include:
For example, Mastercard has already implemented fully agent-driven transactions in some countries.
These are early signals of success.
This gives us confidence that these technologies will eventually become mainstream.
Of course, there are still many issues to resolve, such as:
Can these agents really improve checkout conversion rates?
Are they stable enough?
Currently, more fine-tuning and constraint mechanisms are needed to enable agents to autonomously complete tasks.
Simon Taylor:
I want to specifically mention Sardine, as they have done a lot in the anti-fraud space.
They have a data network of 7 billion devices, built their own models, and recorded the performance of agents.
This historical data and agent workflows are, in themselves, a form of intellectual property.
In e-commerce, your agent workflow is your core IP.
I think this is a very key point.
Alright, thanks to Mesh and all the sponsors for making this show possible.
Bam, I don’t know if you’re like me, but now I hear so many different protocol names that I can’t keep track of them all.
How are you discussing these protocols with clients now? What questions do you ask NVIDIA?
Bam Azizi:
I think the most critical question now is: will the future move towards integration or continue to fragment?
This is a "billion-dollar question." If someone can answer this question, they can build a huge company in this field.
If you ask me, I would lean towards integration, just like the development of the internet.
In the past, there were many different protocols, but ultimately we unified to HTTP.
There were also many protocols for communication between devices, but in the end, they basically unified to Wi-Fi and Bluetooth.
Even in charging interfaces, it went from various different interfaces to one or two standardized ones.
So I think something similar will happen here.
Especially with recent progress on x402, for example, they are pushing to enter the Linux Foundation, hosted by a neutral organization, with support from companies like Stripe and Coinbase.
I work in identity verification and security, and we’ve seen a similar integration process in authentication protocols.
So my judgment is that there will be integration.
But I’m also very curious about Pahal’s perspective.
Another question is:
Will there be different protocols in the future?
For example:
Interaction between humans and agents
Interaction between agents
The UI/UX and protocols for these two scenarios may be completely different.
What do you think about the current market developments?
Simon Taylor:
I'm reminded of the classic XKCD comic about standards:
"Situation: there are 14 competing standards. We need one universal standard!"
"Soon: there are 15 competing standards."
You’ve been in this field for so long, how do you see this issue?
Pahal Patangia:
Yes, if I had a crystal ball, I would love to know the answer (laughs).
But from our perspective, I agree with Bam’s point:
Ultimately, these protocols will converge to a few mainstream solutions.
But in the process, the current diversity is actually a good thing.
Because these protocols are activating more developers and getting more people to start building.
The current stage is actually the "democratization stage," similar to the development of LLMs over the past three years.
Different models keep emerging, driving adoption across the industry.
The same thing will happen with these protocols.
These protocols will attract more and more participants—developers, enterprises, users—everyone will build on these foundations.
This will promote the development of interoperability, ultimately leading to integration.
Additionally, as more agents are built, security issues become increasingly important.
Everyone is building their own agent systems, but it’s essential to ensure these systems operate in a secure environment.
That’s why we released something called OpenSHIELD at GTC.
OpenSHIELD is an open-source runtime that is security-hardened, positioned between agents and infrastructure.
It can provide a sandbox environment for agents to operate in a controlled setting.
This way, even if issues arise, the impact can be contained.
Simon Taylor:
Yes, this is very critical.
Many people don’t realize:
When you’re building agents, and you also have a production environment, should you put the agents into production?
If there’s no isolation, once something goes wrong, the impact will be significant.
So sandbox mechanisms like OpenSHIELD are very important.
I also thought of an example: in the early mobile internet days, there was WAP, which people tried to use for payments before smartphones appeared.
Agentic Commerce, to some extent, may still be in a very early stage.
So I’m curious:
How are you allocating your focus now?
Are you primarily focused on stablecoins?
Or on human-agent interactions?
Or on agent-agent interactions?
Are you doing all of them, or do you have a focus?
Pahal Patangia:
That’s a great question.
From my perspective, we are mainly focused on the most important trends right now:
Payment Foundation Models
Agentic Commerce
But within these, new sub-trends will continue to emerge.
For example, stablecoins.
We see stablecoins as a complement to the existing fiat currency system, bringing new users and new ecosystems.
The next generation of users may be more accustomed to using stablecoins rather than credit cards.
But at the same time, there will be integration between the two.
However, fundamentally, the core use cases of AI in payments have not changed:
Anti-fraud
Identity verification
Personalization
These remain the most important.
Simon Taylor:
Yes, essentially it’s still the added value of payments.
Whether you’re using stablecoins or card networks, these issues will exist.
Bam, I’m curious about your view. You’re building a network in the stablecoin space; how do you see the relationship between Agentic Commerce and stablecoins?
Bam Azizi:
I believe Agentic Commerce can utilize different payment rails.
For example, now users are searching for products like shoes or T-shirts on ChatGPT, Anthropic, or Perplexity, and then agents can help users complete payments.
This payment can be made with credit cards or stablecoins.
In this scenario, both are parallel.
But in cross-border payments and international transactions, stablecoins will have more advantages.
In agent-to-agent scenarios, I believe stablecoins have a distinct advantage.
The reason is:
These transactions are usually micropayments.
For example, amounts like $0.00005.
Such amounts cannot be processed by Visa or traditional banking systems.
At the same time, these transactions need to be:
Real-time
Global
Online
Stablecoins perfectly meet these conditions.
Another point is the transaction frequency.
A person might do 2 transactions a day on average, but an agent might do 2000 transactions a day.
This kind of TPS (throughput) can only be supported by blockchain.
Traditional payment systems are not designed for agents; they will fail.
So I am very optimistic about the application of stablecoins in Agentic Commerce.
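Bam's two arguments, fees and throughput, are simple arithmetic. In this sketch the card fee model ($0.30 fixed + 2.9%) and the one-million-agent fleet size are assumptions; the $0.00005 payment and the 2,000 transactions per agent per day are figures from the conversation:

```python
# Why card rails strain under agent-to-agent micropayments.
# Assumed card fee model: $0.30 fixed + 2.9% of the amount.

def card_fee(amount, fixed=0.30, pct=0.029):
    return fixed + amount * pct

micro = 0.00005                  # micropayment size from the conversation
fee = card_fee(micro)
print(fee / micro)               # the fee is thousands of times the payment

agents = 1_000_000               # assumed fleet size
tx_per_agent_per_day = 2_000     # figure from the conversation
tps = agents * tx_per_agent_per_day / 86_400
print(round(tps))                # sustained transactions per second on the rails
```

At $0.30 fixed, the fee is roughly 6,000x the payment itself, and a million-agent fleet alone generates card-network-scale sustained load; that is the structural gap stablecoin rails aim to close.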
Simon Taylor:
This is indeed an exponential explosion, right?
I remember that there are about 4 million emails sent on the internet every second, and that’s just emails, not counting videos.
In such a world, the capacity of traditional payment systems to handle tens of thousands of transactions per second is clearly insufficient.
But let’s get back to reality a bit, Pahal, from your perspective, where is the real user demand? Where is the real transaction volume?
I often joke that Agentic Commerce now has more protocols than payments does.
You might be the closest to the underlying infrastructure—even the "infrastructure of the infrastructure of the infrastructure."
So where do you see the real demand? Where are the real use cases?
Pahal Patangia:
I think this question can be answered from two perspectives.
The first is from the perspective of the entire ecosystem.
As I mentioned earlier, we can break the entire process into two parts:
Search
Payment
Currently, the search part is relatively mature, even to the point where it can be said to be mostly solved.
The payment part, however, is still in a lot of experimental phases.
Many sandbox tests are ongoing.
This is also why I am very optimistic about tools like OpenSHIELD, as they can help the ecosystem build these agents in a secure environment and enable them to have transaction capabilities.
The second perspective is long-term.
I am very optimistic about the development of multi-agent systems.
In the future world, different agents will interact and collaborate with each other.
Our role is to help these systems improve:
Through feedback loops
Through secure operating environments
Through various constraint mechanisms (guardrails)
Of course, there will also need to be a lot of fine-tuning to ensure these agents can execute as expected without deviating.
These are all directions we will focus on in the future.
Simon Taylor:
I think a very important theme in today’s discussion is "token economics."
In fact, when we talked about tokens earlier, Bam and I both chuckled because in the stablecoin space, our understanding of token economics is a different logic.
But now you will find:
Everything has turned into "tokens."
There are tokens in identity verification
Tokens in cybersecurity
Visa and Mastercard have network tokens
Tokens in open banking
Stablecoins are tokens
Tokens in AI as well
The term "token" in English can actually be quite confusing because it originally just meant "a substitute," but now almost anything can be called a token.
But in any case, you must understand the economic model behind it.
Ultimately, whether in AI or payment networks, what determines user experience is still:
Speed
Cost
These two factors will continually pull us back to reality.
Pahal, thank you very much for your insights today. As someone who has been following NVIDIA for a long time and is also part of the payments industry, this conversation has been very interesting. If people want to learn more about you or NVIDIA's work in payments, where can they go?
Pahal Patangia:
People can reach out to me on LinkedIn or through my email.
If you want to learn about NVIDIA's work in financial services, you can visit NVIDIA's official website, where we have a dedicated industry page detailing our work in payments, banking, and capital markets.
We hope to bring the capabilities of AI to the entire ecosystem and are happy to be your partner.
Simon Taylor:
Great, thank you. Bam, if people want to connect with the Mesh network or reach out to you, how should they do that?
Bam Azizi:
You can visit meshpay.com, or search for Mesh Pay on Twitter or LinkedIn. If you want to find me, you can search for Bam Azizi on Telegram or Twitter.
Simon Taylor:
You can also find me on various platforms or visit fintechbrainfood.com. I recently wrote an article about "invisible commerce," discussing some potential issues with Agentic Commerce. If you enjoyed this show, remember to subscribe, like, and share it with friends so more people can see this content. See you next time.