How Bad Customer Service Drives 85% of Consumers to Seek Chargebacks
PYMNTS.com

Summary:

  • 85% of consumers seek chargebacks due to poor customer service when merchants fail to resolve complaints quickly

  • AI innovations are key to balancing fraud detection with customer experience, minimizing false declines and friction

  • Agility without volatility is essential for resilient systems that adapt quickly without destabilizing portfolios

  • Explainable AI models must be traceable and fairness-tested to meet regulatory demands and build trust

  • Federated learning and data integrity protect privacy by avoiding personal data in training pipelines

In financial services, milliseconds can make or break both transactions and relationships.

The tension between speed, security and customer experience has long been the defining paradox of digital payments and their approvals. Fraud detection systems that lag by even fractions of a second risk alienating legitimate customers, while overly aggressive filters can trigger false declines that erode trust and revenue alike.

"The value of catching fraud is minimized if legitimate customers are caught in the net," Matthew Pearce, vice president of fraud risk management and dispute operations at i2c, told PYMNTS during a discussion for the B2B Payments 2025 event.

Pearce highlighted three key metrics i2c monitors most closely: fraud loss ratio, fraud decline rate and false positive rate. These indicators define the delicate equilibrium between vigilance and usability. The goal is to minimize both losses and friction, a harmony that requires constant recalibration.
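To make those three indicators concrete, here is a minimal sketch of how they might be computed from labeled transaction outcomes. The field names and formulas are illustrative assumptions, not i2c's implementation.

```python
# Hypothetical illustration of the three metrics Pearce describes:
# fraud loss ratio, fraud decline rate and false positive rate.
# Field names and formulas are assumptions, not i2c's method.

def fraud_metrics(transactions):
    """Each transaction is a dict with 'amount', 'is_fraud' (actual outcome)
    and 'declined' (what the system decided)."""
    total_volume = sum(t["amount"] for t in transactions)
    fraud_losses = sum(
        t["amount"] for t in transactions if t["is_fraud"] and not t["declined"]
    )
    declines = [t for t in transactions if t["declined"]]
    legit = [t for t in transactions if not t["is_fraud"]]

    return {
        # Fraud dollars that slipped through, relative to total volume
        "fraud_loss_ratio": fraud_losses / total_volume if total_volume else 0.0,
        # Share of all transactions the system declined
        "fraud_decline_rate": len(declines) / len(transactions) if transactions else 0.0,
        # Share of legitimate transactions that were wrongly declined
        "false_positive_rate": (
            sum(1 for t in legit if t["declined"]) / len(legit) if legit else 0.0
        ),
    }


if __name__ == "__main__":
    sample = [
        {"amount": 120.0, "is_fraud": False, "declined": False},
        {"amount": 80.0, "is_fraud": True, "declined": True},
        {"amount": 45.0, "is_fraud": False, "declined": True},   # false decline
        {"amount": 200.0, "is_fraud": True, "declined": False},  # fraud loss
    ]
    print(fraud_metrics(sample))
```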

Balancing Catch Rates and Friction

Fortunately, the future looks promising, in large part thanks to innovations in artificial intelligence (AI). As agentic, generative and predictive AI become core to financial operations, their applications are proving that performance and accountability can coexist.

"Leading institutions measure performance across multiple dimensions and tune models continuously to maintain that equilibrium. ... Modern defense blends real-time anomaly detection with controlled retraining cycles," Pearce said, noting that agility can be a true differentiator.

The result is what he called "agility without volatility," or systems that evolve quickly enough to keep pace with fraudsters, but not so reactively that they destabilize existing portfolios.

"Agility without volatility is the new definition of resilience," Pearce said. "Adaptability matters as much as accuracy."

Building Explainability

At the same time, while AI has become indispensable in financial operations, it has also raised uncomfortable questions about trust and accountability. Regulators have sharpened their focus on 'black box' decision systems, demanding explainability in areas like credit scoring, dispute resolution and fraud detection.

For Pearce, these aren’t checkboxes; they’re design principles. At i2c, every AI model is versioned, documented and fairness-tested before deployment. When a regulator or client asks why a decision was made, the company can produce a clear narrative: the data lineage, rationale and governance path that led to that outcome.

"Every outcome [must] be traceable, from the features and rules behind it to the business impact that it creates," Pearce said. "We build explainability into the model and into the model lifecycle. It’s not an afterthought, it’s part of the process."

Key to this "full story" explainability is data. AI’s promise in payments is only as strong as the data that fuels it.

"We draw insights from a broad mix of transaction data, dispute outcomes and behavioral patterns," he continued. "Each dataset goes through schema checks, drift tracking and challenger testing before a model moves into production."

Federated Learning and Data Integrity

The key is federation: a local/global hybrid design maintains predictive power without overfitting to any single data source.

"Models learn from global trends, but adapt locally," Pearce said. "That lets us maintain performance accuracy without biasing the model to a single portfolio."

Equally important is what doesn’t enter the system. Personally identifiable information is never part of i2c’s training pipelines. Instead, the company tokenizes or hashes identifiers at the architectural level, ensuring that "models only ever see attributes relevant to prediction, not the customers behind them." When explanations are generated, Pearce said, they’re built from structured metadata, not raw personal details.
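A hedged example of keeping personal data out of a training pipeline: identifiers are salted and hashed before any feature row is written, so models only ever see opaque tokens and behavioral attributes. Field names and the salting approach are invented for illustration.

```python
# Hypothetical tokenization at the edge of a training pipeline. Identifiers
# are hashed with a salt; names, addresses and other PII are simply never
# copied into the feature row. Field names are invented for this example.

import hashlib

SALT = b"rotate-me-via-a-secrets-manager"  # placeholder, not a real policy

def tokenize(identifier: str) -> str:
    return hashlib.sha256(SALT + identifier.encode("utf-8")).hexdigest()

def to_training_row(raw_event: dict) -> dict:
    return {
        "card_token": tokenize(raw_event["card_number"]),
        "amount": raw_event["amount"],
        "merchant_category": raw_event["merchant_category"],
        # name, email, address etc. are deliberately never carried over
    }
```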

That kind of approach may soon become table stakes. As regulators like the Federal Reserve and the Consumer Financial Protection Bureau evolve their frameworks for algorithmic accountability, financial institutions will need systems that don’t just perform well, but that can show how they perform well.

"Transparency never becomes the cost of security," Pearce said. "Privacy protection really begins upstream."

From Pilot to Proof of Impact

Even the most sophisticated AI systems can falter without a clear implementation pathway. For banks and FinTechs, the challenge often isn’t what to build, but how to operationalize it.

"Effective AI adoption follows a disciplined 90-day cycle," Pearce said. "Scope and success criteria first, integration and configuration next, and then the limited rollout. ... The toughest barriers are not technical — they’re organizational. Governance, approvals, data quality and regulatory comfort often slow AI more than coding ever does."

The goal, he added, isn’t "proof of concept." It’s "proof of impact."
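One way to read that 90-day cycle is as configuration rather than code: each phase has a duration and an exit criterion, ending in proof of impact. The durations and criteria below are assumptions; only the phase names come from the quote.

```python
# Illustrative encoding of the 90-day adoption cycle Pearce outlines.
# Durations and exit criteria are assumptions for the sketch.

ADOPTION_CYCLE = [
    {"phase": "scope_and_success_criteria", "days": 30,
     "exit": "metrics and owners agreed"},
    {"phase": "integration_and_configuration", "days": 30,
     "exit": "data flows validated in a test environment"},
    {"phase": "limited_rollout", "days": 30,
     "exit": "proof of impact against the agreed metrics"},
]
```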

By shifting the focus from feasibility to results, i2c’s goal is to reframe AI not as an experimental venture but as a strategic asset.

It’s a distinction that resonates with financial institutions seeking clearer ROI on their digital transformation efforts. It also informs why solutions like i2c’s are integrated through APIs designed to coexist with legacy cores and CRMs, reducing the resource burden on clients.

"Client resources stay light," Pearce noted. "They have data access, compliance oversight and a technical liaison, while the provider shoulders the setup and governance."

His team’s work hints at what the next phase of FinTech evolution will look like: intelligent systems that are not only faster and more adaptive, but also more ethical and auditable.
