Mariia Miakisheva: “The First Thing that Breaks Is the Assumption that the Data Is Already Comparable”

TechBullion • February 10, 2026

Companies Mentioned

Bank for International Settlements

Why It Matters

Without a single source of truth, tokenised settlements cannot deliver accurate reporting, exposing firms to operational risk and missed growth opportunities. Implementing data integrity first enables safe adoption of new payment rails and stronger financial governance.

Key Takeaways

  • Data comparability assumption fails across multi‑rail payments
  • Standardized reference data enables single source of truth
  • Governance precedes tooling for reliable finance data
  • Mapping transactions to P&L creates audit‑ready reporting
  • Controlled capital allocation reduces risk in fast growth

Pulse Analysis

Tokenisation promises near‑instant settlement, but the real bottleneck lies in the data that feeds finance systems. Companies juggling bank feeds, payment providers, and crypto platforms often receive transaction records in disparate formats, with mismatched identifiers and inconsistent cut‑off times. This fragmentation prevents a unified view of cash positions, leading to duplicate entries, missing transactions, and unreliable profit‑and‑loss reporting. By establishing a common data schema—capturing essential fields such as entity, currency, counterparty, and purpose—organizations create a foundation for accurate, real‑time liquidity monitoring.
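The common schema described above can be sketched as a normalized transaction record. This is a minimal illustration, not a prescribed standard: the field names, the UTC convention, and the source systems mentioned in comments are all assumptions for the sake of the example.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from decimal import Decimal

@dataclass(frozen=True)
class Transaction:
    """One cash movement, normalized from any rail (bank, PSP, crypto)."""
    entity: str            # legal entity that owns the account
    occurred_at: datetime  # timezone-aware, normalized to UTC
    currency: str          # ISO 4217 code, e.g. "EUR"
    amount: Decimal        # signed: positive inflow, negative outflow
    counterparty: str      # canonical counterparty id from reference data
    purpose: str           # standardized purpose/category code
    source: str            # originating system, e.g. "bank_a" (illustrative)
    source_ref: str        # the source system's own transaction id

def normalize_timestamp(ts: datetime) -> datetime:
    """Convert every source timestamp to UTC so cut-off times are comparable."""
    return ts.astimezone(timezone.utc)
```

With every rail mapped into one record shape, cash positions can be aggregated by entity and currency without per-source special cases.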

Once the data is standardized, the next critical step is governance. Defining ownership, data definitions, and reconciliation standards ensures that automation tools operate on consistent rules rather than amplifying inconsistencies. Mapping each transaction to a specific line in the management P&L provides traceability and auditability, turning raw cash movements into decision‑grade financial statements. This governance‑first mindset enables finance teams to shift from manual reconciliation to automated, near‑real‑time close cycles, supporting faster strategic decisions and reducing operational risk.

For tech leaders, the strategic implication is clear: prioritize data integrity before expanding payment rails. A controlled, single‑source finance data layer acts as a safety net, allowing firms to adopt new tokenised or crypto settlement solutions without compromising reporting accuracy. Moreover, applying the same disciplined portfolio management principles to capital allocation—treating investments as governed projects with staged funding—enhances risk visibility and aligns growth with cash flow health. In a landscape where speed and innovation are paramount, a robust data foundation becomes the competitive advantage that turns tokenisation from a theoretical concept into a practical, value‑adding capability.

Mariia Miakisheva: “The first thing that breaks is the assumption that the data is already comparable”

A finance transformation consultant and an international judge at the global Cases&Faces Awards 2025 explains how to make multi‑rail payment data decision‑ready

On June 24, 2025, the Bank for International Settlements outlined a blueprint for a tokenised “unified ledger,” arguing that tokenisation can combine messaging, reconciliation, and settlement and modernise cross‑border payments and market infrastructure. Yet inside companies, the immediate constraint is different: turning multi‑rail transaction data into a clean, provable view of liquidity and performance. When flows span multiple banks, payment providers, and crypto rails, transaction records often do not align across systems, which undermines reporting accuracy and cash visibility.

That is the problem finance transformation consultant Mariia Miakisheva helps companies solve. Her work has delivered decision‑grade visibility across multi‑jurisdiction cash flows, reduced reliance on manual processing, and strengthened control and auditability of reporting. She has also helped companies move from ad‑hoc spending to a controlled growth model that balances dividends with expansion, making investment risk measurable and transparent. In scaling finance functions, she has shifted teams from payment execution to management control, accelerating close, improving spending discipline, and reducing operational risk. Her perspective is also grounded in published research that other practitioners can apply, including the paper “Intelligent Cash Flow Management Systems In The Context Of Digital Transformation,” published in the scientific journal Economics of Sustainable Development, which outlines approaches to cash‑flow management systems in the context of digital transformation.

In this interview, we discuss how to make tokenised and multi‑rail payments decision‑ready and how to engineer an audit‑ready finance data layer that enables near‑real‑time liquidity visibility and faster closes.


Mariia, as tokenised settlement pilots and unified‑ledger concepts move from theory to implementation, what do you see as the main gap between the concept and what companies can run reliably today?

The main gap is that public discussion focuses on settlement infrastructure, while companies still need reliable finance data for reporting and decisions.

Even with faster settlement, a business must consistently identify the counterparty and purpose of each transaction, capture fees and foreign‑exchange effects correctly, and reflect the movement of funds in management reporting across multiple entities and countries. In practice, the same transaction details arrive from different systems in different formats and with different timing. If those inputs are not standardized and controlled, the numbers will not match across reports and teams.

As a result, tokenised settlement can reduce transfer time, but it does not automatically produce accurate liquidity visibility or trustworthy profit‑and‑loss reporting.


In one of your projects, you consolidated cash movements across multiple countries, while a part of the flows ran through crypto infrastructure. What typically breaks first when a company tries to build a single source of truth in that setup?

The first thing that breaks is the assumption that the data is already comparable. Teams export statements from banks and crypto providers and combine them, but the same transaction is described differently across sources and time zones. Transaction identifiers do not match, naming is inconsistent, and timestamps follow different cut‑off rules. As a result, you get duplicates, missing transactions, and incorrect categorization.

To build a real single source of truth, you need standardized reference data for counterparties and categories, consistent classification rules, and a reconciliation process that verifies completeness and prevents double counting. Without these controls, finance cannot produce a reliable cash position or weekly cash flow, and management cannot use the numbers for decisions.
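The controls described here can be sketched as two small checks: duplicate detection over a canonical transaction key, and a completeness check against statement balances. The key fields are an assumption for illustration; in practice they depend on what each rail actually exposes.

```python
from collections import Counter
from decimal import Decimal

def dedup_key(tx: dict) -> tuple:
    """Canonical identity for a movement that may be seen via more than
    one source. The chosen fields are illustrative, not definitive."""
    return (tx["entity"], tx["occurred_at"], tx["currency"],
            tx["amount"], tx["counterparty"])

def find_duplicates(transactions: list[dict]) -> list[tuple]:
    """Return keys that appear more than once (double-counting risk)."""
    counts = Counter(dedup_key(tx) for tx in transactions)
    return [key for key, n in counts.items() if n > 1]

def completeness_check(opening: Decimal, closing: Decimal,
                       transactions: list[dict]) -> bool:
    """Statement balances must be fully explained by captured movements;
    a gap means missing or phantom transactions."""
    return opening + sum(tx["amount"] for tx in transactions) == closing
```

Run together, the two checks catch the failure modes named above: duplicates from overlapping exports, and gaps from missing records.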


You built a finance data architecture that links transaction‑level cash movements to the company’s management Profit and Loss statement, which is the internal report showing revenue, expenses, and profit by the way leadership runs the business. How do you implement that link in practice?

I implement the link by standardizing the transaction data first, then applying consistent reporting rules.

  1. Capture – Every cash movement from banks, payment providers, and other rails is captured in a single dataset with the same required fields (entity, date‑time, currency, counterparty, purpose, etc.). Reference data is unified so counterparties and categories are consistent across all sources and countries.

  2. Map – We define mapping rules that assign each transaction to a management‑reporting category and to a specific line in the management P&L (revenue, cost of sales, operating expenses, fees, etc.). Fees and foreign‑exchange effects are calculated separately and placed in the appropriate line items.

  3. Reconcile – Reconciliation checks verify that the aggregated P&L can be traced back to the underlying transactions. This makes the internal P&L consistent, auditable, and fast to produce on a recurring basis.
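The Map step above can be illustrated as rule-driven aggregation from purpose codes to management P&L lines. The purpose codes and line names below are hypothetical examples, not the consultant's actual chart of accounts.

```python
# Mapping rules: standardized purpose code -> management P&L line.
# Codes and line names are illustrative assumptions.
PNL_MAPPING = {
    "SAAS_REVENUE": "Revenue",
    "HOSTING":      "Cost of sales",
    "PAYROLL":      "Operating expenses",
    "CARD_FEES":    "Payment fees",
}

def map_to_pnl_line(purpose: str) -> str:
    """Unmapped purposes fail loudly instead of landing in a catch-all,
    so every cash movement is assigned to exactly one P&L line."""
    if purpose not in PNL_MAPPING:
        raise KeyError(f"No P&L mapping for purpose {purpose!r}")
    return PNL_MAPPING[purpose]

def aggregate_pnl(transactions: list[dict]) -> dict:
    """Roll transaction amounts up into P&L lines. Because each input row
    keeps its source reference, the aggregate stays traceable."""
    lines: dict = {}
    for tx in transactions:
        line = map_to_pnl_line(tx["purpose"])
        lines[line] = lines.get(line, 0) + tx["amount"]
    return lines
```

Keeping the mapping as declared data (rather than logic scattered across pipelines) is what makes the Reconcile step possible: the same rules can be replayed to trace any P&L line back to its transactions.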


CTOs and data engineers will ask the obvious question: is this mostly a tooling problem or a governance problem?

I think it is both, but governance comes first. You can build strong tooling—ETL pipelines that pull data from banks, payment providers, and crypto platforms through APIs—but the system will not be reliable if you have not defined ownership and accountability, data definitions, cut‑off rules, reconciliation standards, and the operating calendar for reporting and close. Once those rules are clear, tooling becomes highly effective because it automates consistent decisions. Without governance, automation only scales inconsistency, and the same transactions will be classified differently by different teams or at different times.
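As a small illustration of one such governance rule, a cut-off can be expressed as shared data applied uniformly, rather than logic rewritten in each pipeline. The 17:00 UTC cut-off is an assumed example; timestamps are assumed already normalized to UTC at capture.

```python
from datetime import date, datetime, time, timedelta

# Cut-off governance expressed as data, not per-pipeline code.
# The 17:00 UTC cut-off is an illustrative assumption.
REPORTING_CUTOFF_UTC = time(17, 0)

def reporting_date(occurred_at_utc: datetime) -> date:
    """Assign a movement to a reporting day under one shared cut-off:
    anything after the cut-off belongs to the next reporting day."""
    day = occurred_at_utc.date()
    if occurred_at_utc.time() > REPORTING_CUTOFF_UTC:
        day += timedelta(days=1)
    return day
```

Because every team and tool calls the same rule, the same transaction can never land in different reporting days depending on which system processed it.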


In one project, you helped a company shift from ad‑hoc spending to a controlled growth model, balance dividends with expansion, and make investment risk measurable through portfolio reporting and stage financing. What is the core takeaway for tech leaders?

The core takeaway is that capital allocation needs the same operating discipline as production systems. When you treat investments as a governed portfolio, you replace one‑off decisions with a repeatable process. Projects are defined with clear entry criteria, staged funding, and regular checkpoints. That makes risk visible early, because leadership sees the portfolio status in a consistent format and can compare initiatives on the same basis. In this case, that discipline protected the core business cash flow and prevented new initiatives from growing without limits. It also created a clear mechanism to balance dividend expectations with expansion, because funding decisions were tied to rules and portfolio health rather than pressure or optimism.


Your positioning is that finance should run in parallel with the business, not chase it after the fact. What does that look like week to week?

It means finance is present at the moment decisions are formed—not to block initiatives, but to model consequences early and adapt reporting logic before change hits the numbers. Practically, that involves management rhythms: weekly status cadence, closing checklists, variance reviews with root‑cause analysis, and reporting that founders can read without needing a finance translator. The goal is to replace intuition‑only decisions with measurable models, without slowing the business down.


You also scaled a finance function and reduced manual chaos by restructuring responsibilities and standardizing routines. Where do fast‑growing companies most commonly lose control?

They lose it in the gray zones: unclear ownership, inconsistent approvals, and no clear split between recurring and ad‑hoc spend. I’ve seen teams drowning in manual registries because nobody defined stable processes for repeatable payments, reconciliations, and HR‑finance synchronization. When you introduce segregation of duties, predictable calendars, standardized templates, and automated checks for routine items, finance stops being “the payments desk” and becomes a control and decision centre.


If you had to give tech leaders one principle for the next 12 to 18 months, as tokenisation discussions accelerate, what would it be?

Don’t start with the rails. Start with data integrity. Whether money is moved through banks, payment providers, or crypto infrastructure, you need a single, controlled truth layer: reference data, classification rules, reconciliation, and cut‑off governance. If you build that foundation, you can adopt new rails safely. If you don’t, every new rail just multiplies reporting ambiguity and risk.
