
Immuta launches Agentic Data Access module for AI agents
Immuta unveiled an Agentic Data Access module that lets autonomous AI agents retrieve enterprise data in real time while enforcing governance policies. The module treats agents as first‑class data users, applying least‑access and zero standing privileges and providing audit trails, all built on Immuta’s policy engine.

T‑Systems launched “Talk to your data,” an AI‑driven chat platform that connects disparate corporate data sources, searches, analyzes, and visualizes information on demand. The solution uses an ontology layer to map data across systems, enabling natural‑language queries. Pilot projects, including a public‑sector report and hospital use cases, cut analysis time from six hours to under an hour. The service runs on T‑Systems’ sovereign AI cloud, keeping data within a secure, private infrastructure.
The open‑source pg_duckpipe extension adds real‑time change data capture to PostgreSQL, continuously replicating heap tables into DuckLake columnar tables using logical WAL streaming. A single SQL call—duckpipe.add_table—starts the sync, and the solution works without Kafka, Debezium, or any external orchestrator,...
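The summary names only the entry point, `duckpipe.add_table`. A minimal sketch of what that call might look like — the exact signature and any prerequisites beyond logical WAL streaming are assumptions, not documented API:

```sql
-- Assumes the extension is installed and the server runs with wal_level = logical.
-- The argument shape is hypothetical; only the function name comes from the source.
CREATE EXTENSION IF NOT EXISTS pg_duckpipe;

SELECT duckpipe.add_table('public.orders');  -- start replicating this heap table into DuckLake
```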
EDB’s Data Governance Co‑Pilot AI quickstart, built on Red Hat OpenShift AI and Postgres AI, embeds governance policies directly into the query generation process. By using the pg‑airman‑mcp server, user prompts are filtered through uploaded policy files, producing compliant SQL and...
Boomi unveiled a 2026 platform upgrade that pivots from AI pilots to "data activation," introducing the Meta Hub as a unified system of record to eliminate fragmented data. The new Agent Control Tower adds governance, session logs, and observability for...

Implementing data pipelines is essential for digital transformation and AI, yet teams repeatedly encounter vague requirements, poor data quality, scalability bottlenecks, orchestration complexity, and monitoring gaps. These challenges cause costly rework, downstream errors, and performance degradation. Solutions include detailed requirement...
CLI-first is eating development. Email, calendar—they all have CLIs now. Why not your business metrics? For data/analytics engineers building with agents: DuckDB + MotherDuck + Rill give you an agent-friendly, #localfirst frontend—exact context via SQL and YAML. https://www.rilldata.com/blog/building-an-agent-friendly-local-first-analytics-stack-with-motherduck-and-rill
Point‑of‑sale systems are evolving from simple cash registers into real‑time, connected platforms that handle payments, inventory, and customer insights. Mobile payment leaders Square, SumUp, and Shopify now offer SMBs enterprise‑grade POS capabilities, blurring the line between payment processors and commerce...

Zalando, which generates roughly €3 billion in quarterly fashion sales, ran into soaring AWS costs and unstable Flink clusters due to the way Flink 1.20’s Table API handled chained joins. The joins caused state to balloon to over 240 GB per application, leading...

Enterprises are grappling with fragmented data landscapes, prompting a surge in data catalog adoption. Modern catalog tools not only inventory metadata but also embed AI, generative AI, and natural‑language interfaces to accelerate discovery and governance. The article lists 15 leading...

Understanding #Data Fabric is Key to Modern Data Management and Efficiency by @antgrasso #DataScience #BigData https://t.co/6OxSioKNji

The post details a new Kafka‑based log pipeline that guarantees exactly‑once processing, eliminating duplicate handling even during failures. It combines idempotent producers, transactional consumer commits, a Redis‑backed deduplication layer, and a state‑reconciliation service to create an end‑to‑end exactly‑once flow. The...
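The Redis-backed deduplication layer in that design can be sketched in a few lines. This is illustrative only: a plain Python set stands in for Redis, and in production you would claim each event id atomically (e.g. `SET key NX EX <ttl>`) so two consumers cannot both process the same delivery.

```python
# Sketch of a deduplication layer for at-least-once Kafka delivery.
# A set stands in for Redis; event ids are assumed to be unique per event.

class DedupLayer:
    def __init__(self):
        self._seen = set()  # stand-in for a Redis key space with TTLs

    def process_once(self, event_id, payload, handler):
        """Run handler(payload) at most once per event_id."""
        if event_id in self._seen:
            return False          # duplicate delivery, e.g. after a consumer restart
        handler(payload)          # apply the side effect
        self._seen.add(event_id)  # record the dedup marker afterwards
        return True

results = []
layer = DedupLayer()
layer.process_once("evt-1", 10, results.append)
layer.process_once("evt-1", 10, results.append)  # replayed delivery, dropped
layer.process_once("evt-2", 20, results.append)
print(results)  # [10, 20]
```

Note the ordering trade-off: marking the id *after* the handler favors at-least-once within the dedup layer itself, which is why the post pairs this with transactional consumer commits and a reconciliation service.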
CDO Retail Exchange 2026 convenes 70 senior retail data, analytics and AI executives from brands such as Adidas, IKEA and Shiseido. The closed‑door forum is designed to move AI projects from pilot to profit, focusing on real‑time decisioning, margin‑boosting use...
Liquid Clustering is a Delta‑Lake layout strategy that dynamically groups rows by query‑driven columns instead of static folder partitions. By continuously reorganizing files, it makes file‑level statistics more useful, enabling stronger data skipping and smaller scan footprints. Engineers enable it...
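In Databricks SQL, Liquid Clustering is enabled with a `CLUSTER BY` clause instead of `PARTITIONED BY`; the table and column names below are illustrative:

```sql
-- New table with clustering keys chosen from common query predicates
CREATE TABLE sales (order_id BIGINT, customer_id BIGINT, order_date DATE)
CLUSTER BY (customer_id, order_date);

-- Existing tables can switch clustering keys without rewriting the schema
ALTER TABLE sales CLUSTER BY (order_date);
```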
Confluent executives warned that AI initiatives will falter without fresh, governed data in motion, shifting focus from model perfection to real‑time data architecture. They described a transition from batch‑based business intelligence to continuous, autonomous AI that requires millisecond‑latency streams and...
Dagster's key innovation is software-defined assets. Instead of "Run this job on a schedule," you declare "I need this table to exist, here's how to build it." The difference is subtle but profound. Assets have identities, dependencies, and history. Jobs are just tasks. When...
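The declarative idea can be shown without Dagster itself. This toy sketch (not Dagster's real API) registers assets with declared dependencies, and a tiny resolver materializes upstream assets first, memoizing each one so it is built exactly once:

```python
# Toy asset model: declare what should exist and how to build it,
# and let a resolver work out the build order from dependencies.

ASSETS = {}  # asset name -> (build function, upstream asset names)

def asset(*, deps=()):
    def register(fn):
        ASSETS[fn.__name__] = (fn, tuple(deps))
        return fn
    return register

def materialize(name, _cache=None):
    cache = {} if _cache is None else _cache
    if name not in cache:
        fn, deps = ASSETS[name]
        # Build upstream assets first, then this one; memoize in cache.
        cache[name] = fn(*(materialize(d, cache) for d in deps))
    return cache[name]

@asset()
def raw_orders():
    return [{"id": 1, "amount": 40}, {"id": 2, "amount": 60}]

@asset(deps=("raw_orders",))
def daily_revenue(raw_orders):
    return sum(row["amount"] for row in raw_orders)

print(materialize("daily_revenue"))  # 100
```

Asking for `daily_revenue` pulls in `raw_orders` automatically — the scheduler's job becomes "make these assets exist," not "run these tasks in this order."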

The healthcare data landscape is finally moving from three decades of batch ETL to event‑driven pipelines powered by Kafka, Flink and modern cloud services. Legacy systems were built around billing cycles, leaving clinicians without real‑time data for urgent decisions. Recent API...

The data industry is rapidly converging on open standards, and dbt Labs is leading the charge by migrating its entire data stack to an Iceberg‑based lake that supports multiple compute engines. In a recent podcast, Anders Swanson outlined the current...
Snowflake has expanded its Cortex Code CLI, an AI‑driven coding agent, to support the open‑source data‑pipeline frameworks dbt and Apache Airflow. The extension leverages Anthropic’s Agent Skills to automate debugging, testing, and optimization of pipelines, and is offered through a new...

In this episode, Anders Swanson, a developer experience advocate at dbt Labs, walks through the current state of the Apache Iceberg ecosystem, covering how open‑source and cloud vendors are converging on shared standards, the rise of external catalog integrations, and...
Bruce Momjian delivered two recent talks to the PostgreSQL community: a deep‑dive on the write‑ahead log (WAL) at the Scale conference on March 7, 2026, and a candid assessment of PostgreSQL’s missing features at Prague PostgreSQL Developer Day on January 28, 2026....
Imagine a place where you could: • Pick a data project • Follow a structured workflow • Build something real • Add it straight to your portfolio That's the direction we're exploring.
The article argues that data quality improvements don’t require top‑down mandates; engineers can start fixing messy source data by writing tests, documenting issues, and building simple dashboards. By turning test failures into evidence, teams persuade source‑system owners to add validation,...
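A "tests as evidence" check can start very small: run simple rules against sample rows and collect failures the source-system owner can act on. The field names and rules below are hypothetical, purely for illustration:

```python
# Sketch of a bottom-up data quality test: each failure becomes a
# concrete, row-level piece of evidence to show the source-system owner.

rows = [
    {"customer_id": "C1", "email": "a@example.com"},
    {"customer_id": "C2", "email": None},  # messy source data
]

def run_checks(rows):
    failures = []
    for i, row in enumerate(rows):
        if row["email"] is None:
            failures.append((i, "email is null"))
    return failures

print(run_checks(rows))  # [(1, 'email is null')]
```

Logging these tuples over time (or charting failure counts on a simple dashboard) turns a vague complaint about "bad data" into a trend the upstream team can't ignore.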

Grupo Financiero Banorte teamed with Hitachi Vantara to relocate its primary data center from Mexico City to Querétaro, moving 450 TB of information in under an hour. The migration introduced two mainframes, three Hitachi storage arrays, and the Virtual Storage Platform...

Amazon Web Services is poised to invest $750 million in a new data center on a 99‑acre site in Clinton, Mississippi, repurposing the former Milwaukee Tool facility. The city council approved a fee‑in‑lieu tax arrangement, though final approval from the Mississippi...

MinIO has launched AIStor Table Sharing, embedding the Delta Sharing open protocol directly into its AIStor object store. The feature lets enterprises expose on‑premises data to Databricks in real time, eliminating the need for costly data replication. Built on Iceberg...

In this episode the hosts explore whether a true single source of truth (SSOT) for construction project data is achievable or merely aspirational. NuFORMA’s Dave Wagner and Carl Beillette argue that a single vendor solution is unrealistic; instead, the goal...
Panome Bio, a multi‑omics contract research organization, unveiled an exposomics service platform that pairs untargeted Discovery Exposomics with targeted quantification of priority chemicals. The Discovery workflow leverages the MassID™ engine and a 32,000‑compound database to profile environmental exposures without prior...
I spent years working with data warehouse automation tools before the modern data stack existed. The biggest lesson? There are two approaches to generating pipelines: Parametric - you define parameters, the tool generates SQL Template-based - you write SQL templates with variables Most modern...
Generative AI now turns dense, unstructured corporate text—especially 10‑K Item 1 disclosures—into structured, decision‑ready metrics. Researchers fine‑tuned a GPT model on 3,500 labeled sentences and applied it to nearly 10 million sentences from 39,710 filings, creating a climate‑solution intensity score for 4,483...

Part 3 of 3 underused chart types worth knowing. A box plot with 15 points looks identical to one with 1,500. You lose all sense of where measurements actually cluster. Beeswarm plots fix this. Every data point is visible. Nothing gets absorbed into...
Slack is the most important text data source in most companies, but it has the worst data access policies in enterprise software. The only thing that will fix it is competition, and Anthropic is the right company to do it....
Seqster has introduced 1‑Click DataLake, a real‑world data platform that aggregates anonymized electronic health‑record information from over 150 million patients and 200,000 clinicians across the United States. The solution delivers real‑time, longitudinal patient journeys to speed trial design, feasibility assessments, and...

Question for your next meeting: "If 95% of AI projects fail before production, and the reason is data quality, what percentage of our AI budget goes to data quality and governance?" The follow-up that makes it uncomfortable: "How confident are we that...
Most FP&A teams don’t struggle with analytics. They struggle with data. 💡Finance leaders from PepsiCo, BILL, and Workday shared how they build strong data foundations and a single source of truth to enable AI and predictive decision-making: https://t.co/FnD9BnrjT6 #fpatrends

Rail planning teams often add new data feeds that become extra log‑ins and reconciliation chores, leaving planners to rebuild spreadsheets for every decision. The article argues that a dedicated business intelligence (BI) layer, placed atop existing asset stores, can turn...

Validio, a Stockholm‑based data‑quality automation startup, secured $30 million in Series A funding, bringing its total capital to $47 million after an 800% ARR surge last year. The round was led by Plural with participation from Lakestar, J12 and several angels. Validio’s AI‑driven...

Nauta’s AI‑native operating system overlays existing ERP, TMS and WMS platforms to turn fragmented supply‑chain data into a single, live source of truth. By ingesting emails, PDFs and spreadsheets, the platform eliminates “data graveyards” and delivers SKU‑level visibility and automated...

The Association of State and Territorial Health Officials (ASTHO) announced a new public‑health data consortium, partnering with Veritas Data Research and HealthVerity to create a secure data exchange for state and territorial health agencies. The effort seeks to integrate real‑world...

Validio announced a $30 million Series A round led by Plural, bringing total funding to $47 million after an 800% revenue surge. The Stockholm‑based startup offers an automated data‑quality platform that monitors billions of records, detects anomalies, and maps lineage in days rather...
Enterprises are shifting from static data warehouses to a data supply chain model that manages information as a continuous, end‑to‑end flow. The framework defines stages—ingestion, transformation, storage, distribution, and consumption—optimizing each to support AI, analytics, and real‑time insights. By integrating...

Orange Wholesale CEO Michaël Trabbia told MWC that the French telco will not sell its roughly 75 data‑centre assets across Europe, Africa and the Middle East. Instead, Orange plans to monetize the facilities by expanding colocation services for enterprise customers,...

Shylaja Nathan, former senior vice president of architecture at Fidelity, joins Forrester as a principal analyst focusing on enterprise data and AI strategy. Drawing on more than two decades of experience modernizing data platforms for major financial institutions, she stresses...

Tableau is about to die. Introducing PandasAI, a free alternative for fast Business Intelligence. Let's dive in:

Many digital‑twin projects stall after pilot phases because they lack a trusted data foundation. At a recent DBTA webinar, Informatica’s Christian Farra explained that integrating master data and contextual information is essential to turn raw sensor signals into actionable insights....

BlueBox Systems unveiled Tradelane Intelligence, a data‑analytics platform that merges AI‑validated airfreight data with premium ocean data from Vizion. The solution delivers advanced reporting tools for carrier comparison, demurrage alerts, document verification, and an Eco‑Routing module that projects CO₂ emissions....

WisdomAI, an AI‑native business intelligence startup, announced the launch of its Federated Agentic Intelligence platform, shifting its focus from passive insights to autonomous enterprise execution. The platform combines an Enterprise Context Layer, a Model Context Protocol client, and an Adaptive...

Orizon Aerostructures has deployed Flexxbotics’ autonomous manufacturing platform to create a data‑driven, closed‑loop control environment across its aerospace production lines. The integration links CNC machines, FANUC robots, and enterprise PLM systems, feeding multimodal sensor streams into industrial AI for real‑time...

Codelco, the world’s largest copper producer, has signed an 18‑month collaboration framework with Microsoft to embed artificial intelligence, advanced analytics, automation and digital security into its mining operations. Building on a 27‑year partnership, the deal will evaluate joint initiatives, pilot...
Why the most valuable AI systems are not the most accurate ones today, but the ones designed to learn tomorrow

In the early days of enterprise AI, success was measured in a single moment: the model launch. A team would...
AWS and Azure both surging simultaneously. Oracle climbing. Elasticsearch tripled. It's not one cloud winning. It's ALL infrastructure growing as data demand outpaces capacity. The foundation layer is on fire.