
Immuta unveiled the first data provisioning platform designed to manage AI agent access, treating agents as distinct identities with attributes, intent, and audit trails. The Agentic Data Access feature grants just‑in‑time, temporary roles on cloud data warehouses such as Snowflake, Databricks and BigQuery, eliminating standing privileges and manual ticket workflows. By centralizing policy enforcement, Immuta provides real‑time, auditable access while preventing account sprawl and rights inflation. The roadmap includes semantic governance and agent‑initiated access requests, extending control beyond simple provisioning.

anynines unveiled its open‑source Klutch control plane at KubeCon EU, positioning it as the core of the a9s Hub framework for data‑service orchestration across on‑premises and AWS environments. The solution lets platform teams expose databases, object storage and caches through...

Maryland has intensified data‑driven decision making under Governors Larry Hogan and Wes Moore, with Chief Data Officer Natalie Evans Harris describing a statewide "culture shift" toward breaking data silos. The state is building a centralized governance structure and an enterprise...

Informatica and Snowflake partnered in a DBTA webinar to showcase how metadata‑driven governance, data quality and observability can make Snowflake’s AI Data Cloud AI‑ready. The discussion highlighted Informatica’s end‑to‑end data management capabilities, including tag‑based PII masking, automated semantic classification and...

Adactin unveiled AFIVE, an AI‑powered knowledge platform built on Microsoft Azure OpenAI and AI Foundry. It uses retrieval‑augmented generation with LangChain to pull data from SharePoint, Google Drive, Azure Blob Storage and Dropbox. The solution offers natural‑language queries, integrates with...

Frontline child protection workers in New Zealand face growing caseloads, time pressure and fragmented information, making high‑stakes decisions about child safety and family intervention. Predictive modelling, which analyses large administrative datasets to generate risk scores, has been explored for over a...

The Square Kilometre Array Observatory (SKAO) will soon produce up to 60 exabytes of raw data annually, dwarfing the 700‑petabyte baseline currently planned for storage. Scientists are forced to discard raw observations once processed images meet quality thresholds, a practice...

Snowflake’s SnowConvert AI offers an end‑to‑end, AI‑driven solution for migrating Amazon Redshift workloads to Snowflake. It begins with an automated assessment that maps objects, gauges conversion complexity, and creates structured migration waves. The platform then converts SQL and procedural code...
Modern data pipelines face growing data quality challenges that go beyond simple schema checks, as subtle semantic drift and incomplete datasets can silently degrade analytics. Current deterministic quality frameworks rely on static rules and thresholds, which become noisy and costly...
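One common statistical-learning alternative to static thresholds is a drift score such as the Population Stability Index, which compares a fresh batch against a reference sample. The sketch below is illustrative only (the article does not name a specific metric); the ~0.2 alert level is a common rule of thumb, not a universal constant.

```python
import math

def psi(expected, actual, bins=10):
    """Population Stability Index between a reference sample and a fresh batch.
    Values above roughly 0.2 are commonly read as meaningful drift."""
    lo, hi = min(expected), max(expected)
    step = (hi - lo) / bins or 1.0
    def hist(xs):
        counts = [0] * bins
        for x in xs:
            i = min(int((x - lo) / step), bins - 1)
            counts[max(i, 0)] += 1
        # smooth empty bins so the log term below stays finite
        return [(c + 0.5) / (len(xs) + 0.5 * bins) for c in counts]
    e, a = hist(expected), hist(actual)
    return sum((ai - ei) * math.log(ai / ei) for ei, ai in zip(e, a))
```

Unlike a fixed null-count threshold, the score adapts to whatever the reference distribution looks like, so it stays quiet on seasonal wobble and fires on genuine shifts.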

Enterprises are rapidly replacing legacy data architectures with an AI‑ready modern data stack as AI initiatives surge. Deloitte’s 2026 survey shows strategic AI readiness rose to 42%, but confidence in data‑management capabilities slipped to 40%, while an IDC study found...
Databricks is rebranding Delta Live Tables as Lakeflow Spark Declarative Pipelines, adding open‑source Spark alignment and new features. Existing DLT pipelines run unchanged, but Databricks recommends updating imports, decorators, expectations, and CDC logic to the new `dp` API. The migration...

Smart organizations leverage big data to boost performance, but without a clear strategy they risk duplicated projects, compliance breaches, and wasted spend. The article outlines a four‑step framework—defining business goals, assessing data readiness, prioritizing use cases, and creating a flexible...
LightningChart unveiled Dashtera, a no‑code, web‑based analytics platform that leverages GPU‑accelerated rendering to display up to 100 million data points in real time. The solution removes the need for extensive implementation projects, data reduction, or custom integration, delivering instant zoom and...

Informatica announced general availability of Microsoft Fabric Open Mirroring within its Intelligent Data Management Cloud (IDMC) and launched a new Azure‑based IDMC delivery point in Switzerland. The Open Mirroring feature lets customers synchronize data between OneLake and Fabric Data Warehouse...

Booking.com’s data and machine‑learning platform, led by Huy Dao, has completed a seamless migration from on‑prem Hadoop to a Snowflake‑based cloud ecosystem. The new Booking Data Exchange serves over 1,500 practitioners, handling petabytes of data and billions of daily predictions...
At SAPinsider Las Vegas 2026, Ingo Hilgefort warned that data‑driven AI projects fail when organizations lack trust in their data. He argued that inconsistent definitions and poor governance cause users to rebuild dashboards to verify numbers, stalling analytics adoption. Hilgefort...

Rare Hope, a nonprofit focused on rare‑disease hypotheses, adopted Cloudera’s hybrid data‑and‑AI platform to turn unstructured research papers and medical images into structured insights. Using PySpark pipelines, the organization extracts disease‑drug correlations and feeds them to large language models for...

The federal government is accelerating its adoption of generative AI, retrieval‑augmented generation, and early agentic systems, but agencies are constrained by legacy data architectures. Dell’s AI data platform offers a secure, federated foundation that lets classified and regulated data remain...

Utilities are grappling with an "IoT firehose" as smart meters generate massive, continuous telemetry streams. To tame the volume, they are adopting cloud‑based DataOps frameworks that automate ingestion, normalize data, and deliver analytics‑ready datasets at scale. Automated, event‑driven pipelines enable...

Microsoft unveiled Database Hub, an early‑access tool built on the Fabric data platform that consolidates management of Azure SQL Server, Cosmos DB, PostgreSQL, MySQL, Azure Arc‑enabled SQL, and other services. The hub offers a single pane of glass for on‑premises,...

Lloyd’s Register and OneOcean released a report warning that the maritime sector’s surge in operational data is hampered by fragmentation and low standardisation, jeopardising compliance and commercial advantage. Their Digital Maturity Index shows data standardisation at 2.45 out of 4, while overall digital...

Oracle announced the general availability of Oracle Analytics Server 2026, delivering a suite of enhancements aimed at boosting adoption, performance, and governed self‑service. New defaults for the "Limit Values By" filter and a redesigned State menu streamline workbook interactions. The...

GHD has appointed David McLaren as its Enterprise Data & AI Leader, based in Toronto. McLaren brings experience from Coca‑Cola Canada Bottling, where he built enterprise‑scale data platforms, automation and governance. At GHD he will steer the development of an...

Enterprises are increasingly recognizing that knowing where data resides is insufficient without visibility into its lifecycle. Data lineage—tracking origin, transformations, and access—provides the transparency needed for accountability, data quality, compliance, and reduced technical debt. The article highlights how poor lineage...
Current retrieval‑augmented generation (RAG) systems were built for static document search, which creates consistency problems when multiple agents write concurrently. Without transactional control, memory updates can become partially committed, leading to answer drift and silent corruption. The article proposes using...
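A minimal sketch of the all-or-nothing property the article argues for: writes from an agent are staged and either applied as one atomic batch or discarded entirely, so a concurrent reader never sees a half-committed memory update. The class and its API are hypothetical, not from the article.

```python
import threading
from contextlib import contextmanager

class AgentMemory:
    """Toy transactional key-value memory for concurrently writing agents.
    Writes are staged in a per-transaction dict and applied atomically under
    a lock; an exception raised inside the transaction propagates at the
    yield point, so the staged batch is simply discarded (automatic rollback)."""
    def __init__(self):
        self._data = {}
        self._lock = threading.Lock()

    @contextmanager
    def transaction(self):
        staged = {}
        yield staged            # the agent writes into the staging dict
        with self._lock:        # commit: apply the whole batch at once
            self._data.update(staged)

    def get(self, key, default=None):
        with self._lock:
            return self._data.get(key, default)
```

A real system would add versioning and conflict detection, but even this coarse-lock scheme rules out the partially committed updates that cause answer drift.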

Enterprises must move beyond static data catalogs toward a universal AI catalog that combines a business‑friendly semantic layer with cross‑platform interoperability. The semantic layer supplies machine‑readable context, preventing misinterpretations by AI agents, while universal interoperability ensures governance, security, and metadata...

Databricks and Accenture have launched the Accenture Databricks Business Group, a joint venture designed to accelerate enterprise adoption of the Databricks Data Intelligence Platform for AI and data workloads. Backed by more than 25,000 Databricks‑trained professionals, the group will help...

Investments in data platforms have shifted from siloed warehouses to unified, sovereign foundations as agentic AI collapses analytics, operations, and AI into single workflows. Enterprises now need platforms that govern operational execution, high‑concurrency analytics, and AI reasoning together, rather than...
The Better Cotton Initiative (BCI) is launching a $200,000 on‑farm data‑collection effort in partnership with the Soil Health Institute and ag‑tech provider Growers Guide. The program will analyze soil, plant tissue and sap samples across the Southeast and other Cotton Belt...

GigaOm released version 6 of its Unstructured Data Management Radar, expanding the vendor set to 23 and appointing James Brown as the new analyst. The report reclassifies 11 suppliers as leaders and 12 as challengers, with notable moves such as Panzura shifting...
Xata built a product analytics warehouse using vanilla Postgres, consolidating identity, usage, billing, and event data from four separate systems. They employed materialized views, pg_cron schedules, and database branches to flatten JSONB events, refresh data daily, and iterate safely on...
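Xata's pipeline does the flattening in SQL via materialized views, but the core transformation is easy to see in miniature: turning a nested JSONB event payload into a wide row with one column per leaf value. This Python sketch uses hypothetical field names, not Xata's actual schema.

```python
def flatten_event(event, prefix=""):
    """Flatten one nested JSON event into a flat row with dot-joined column
    names -- roughly what a materialized view over a JSONB column exposes."""
    row = {}
    for key, value in event.items():
        col = f"{prefix}{key}"
        if isinstance(value, dict):
            row.update(flatten_event(value, prefix=f"{col}."))
        else:
            row[col] = value
    return row
```

In the Postgres version, pg_cron would refresh the materialized view on a daily schedule rather than re-running application code.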
Microsoft’s Planetary Computer offers a free, standards‑based geospatial data platform that aggregates curated datasets from government, academic and commercial sources. It provides STAC‑compatible APIs, Python and R SDKs, and an Explorer UI for rapid prototyping of environmental applications such as...

Coles Group has deployed an enterprise‑wide data streaming platform built on Confluent Cloud, unifying its real‑time data pipelines under a single Apache Kafka foundation. Previously, isolated event‑streaming stacks created silos, inconsistent models, and governance challenges. The new "enterprise event platform"...
IBM expanded its partnership with Nvidia at GTC 2026 to address enterprise AI data management challenges. The collaboration integrates Nvidia’s cuDF toolkit with IBM’s Presto query engine and adds Nemotron models to IBM’s Docling PDF reader. Nvidia GPUs will also power...
Wix.com has built a real‑time online feature store using Apache Kafka and Apache Flink to power personalized recommendations for its 200 million users. The architecture streams over 70 billion events per day through 50,000 Kafka topics, with FlinkSQL performing low‑latency transformations and...

digna announced a twelve‑month enterprise data‑warehouse deployment that operated without any traditional, manually coded data‑quality rules, relying instead on AI‑driven anomaly detection. The platform replaced thousands of null checks, threshold controls, and custom SQL assertions with statistical learning models that...

Sema4.ai announced the general availability of its AI‑powered Semantic Layer at the Gartner Data & Analytics Summit 2026. The platform lets business users query databases, spreadsheets and documents using plain English, eliminating the need for SQL expertise. It couples a...

Berlin‑based Tower announced a €5.5 million raise across pre‑seed and seed rounds, led by DIG Ventures and Speedinvest. The startup offers a unified storage‑compute platform that lets data engineering teams retain full data ownership while accelerating AI‑driven pipeline development. Leveraging Apache...

Companies are rapidly expanding analytics and AI capabilities, but a new Info‑Tech Research Group study reveals that low data trust is throttling expected business value. Fragmented ownership, inconsistent validation and reactive cleanup dominate current data practices, leading to underperforming analytics...

The article demonstrates how to use the sqlpackage command‑line utility to detect schema drift between Azure SQL databases by comparing a DACPAC file against a target database and generating a delta script. It outlines a lightweight, scriptable workflow that avoids...
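A sketch of that scriptable workflow, assuming the SqlPackage `/Action:Script` flags (a real run also needs authentication arguments, omitted here) and a hypothetical helper that decides whether the generated delta script actually contains changes:

```python
import subprocess

def generate_delta_script(dacpac, server, database, out_path="delta.sql"):
    """Invoke SqlPackage to script the difference between a DACPAC and a live
    database. Server and database values are placeholders; production use
    needs authentication flags as well."""
    subprocess.run(
        [
            "sqlpackage",
            "/Action:Script",
            f"/SourceFile:{dacpac}",
            f"/TargetServerName:{server}",
            f"/TargetDatabaseName:{database}",
            f"/OutputPath:{out_path}",
        ],
        check=True,
    )
    return out_path

def has_drift(script_text):
    """Treat any non-comment, non-batch-separator statement as drift."""
    for line in script_text.splitlines():
        stripped = line.strip()
        if stripped and not stripped.startswith("--") and stripped.upper() != "GO":
            return True
    return False
```

Dropped into a CI job, a non-empty delta script can fail the build, catching out-of-band schema changes before they surprise a deployment.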

Big data delivers eight strategic benefits for businesses, from deeper customer insight to real‑time decision making. By integrating diverse data sources—clickstreams, sensor feeds, social media—companies can personalize experiences, sharpen market intelligence, and streamline supply chains. Advanced architectures like lakehouses enable...

Denodo announced the release of Platform 9.4, a logical data management solution designed to accelerate trusted AI across enterprises. The update adds native vector‑database connectivity, embeds the Model Context Protocol for governed AI data access, and introduces a Lakehouse Accelerator powered...

Alation has launched outcome‑based governance, a system that replaces manual data‑governance processes with an agent‑driven operating model. The new Curation Automation feature, now generally available, automatically enriches and enforces metadata standards across the Alation platform. Organizations can declare business outcomes—such...

Cole Bowden’s DBTA webinar warned against over‑engineered data stacks and advocated a pragmatic approach to time‑series workloads. He urged firms to first assess whether data fits in memory or on a single drive before adopting a specialized database. When scale...
A unified, domain‑aware anomaly detection pipeline maps retail transaction and network traffic streams to a common event schema, enabling real‑time monitoring of rare, high‑impact events. The approach extracts temporal features (e.g., time‑since‑last‑event) and contextual typicality without data leakage, then trains...
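The shape of that pipeline can be sketched with stdlib Python: map both streams onto one event schema, derive a time-since-last-event feature per entity, and score typicality against only the entity's past values (so no future data leaks into the score). Field names and the z-threshold are illustrative assumptions, not the article's actual design.

```python
from collections import defaultdict

def to_common_event(source, record):
    """Map a retail transaction or a network flow record to one shared schema.
    The field names here are illustrative placeholders."""
    if source == "retail":
        return {"entity": record["customer_id"], "ts": record["ts"], "value": record["amount"]}
    return {"entity": record["src_ip"], "ts": record["ts"], "value": record["bytes"]}

def score_events(events):
    """Attach a time-since-last-event feature per entity, then flag values far
    from the entity's running median (history-only, so leakage-free)."""
    last_seen = {}
    history = defaultdict(list)
    scored = []
    for ev in sorted(events, key=lambda e: e["ts"]):
        ent = ev["entity"]
        gap = ev["ts"] - last_seen.get(ent, ev["ts"])
        last_seen[ent] = ev["ts"]
        past = history[ent]
        if past:
            med = sorted(past)[len(past) // 2]
            mad = sorted(abs(v - med) for v in past)[len(past) // 2] or 1.0
            z = abs(ev["value"] - med) / mad   # robust z-score vs. past only
        else:
            z = 0.0
        history[ent].append(ev["value"])
        scored.append({**ev, "gap": gap, "typicality_z": z, "anomaly": z > 6.0})
    return scored
```

Because both domains share the schema, a single scoring loop covers fraud-like retail spikes and traffic-volume anomalies alike.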

ECDB, founded in 2022, delivers transaction‑level e‑commerce market intelligence by processing more than 1 billion purchases each month—about 1‑2% of global online sales. Its platform normalises and enriches this data to provide near‑real‑time visibility across categories, retailers and markets. Retailers using...
Birdzi introduced AskKea, a generative business intelligence assistant that lets grocery retailers ask plain‑English questions and receive decision‑ready answers, visualizations, and exportable data within minutes. The tool integrates structured and unstructured retail data, offering cross‑system insights such as category penetration,...

DataStrike announced an expansion of its Microsoft Fabric services, targeting organizations that are adopting the unified analytics platform. The new portfolio includes a two‑week Fabric readiness and proof‑of‑concept engagement, end‑to‑end migration assistance, and 24/7 managed operations. Services span OneLake, lakehouse...
At HIMSS26, Sequoia Project’s Didi Davis unveiled the second USCDI v3 data‑usability guide, expanding coverage to all data classes and emphasizing provenance, traceability, and persistent identifiers. The 60‑page guide outlines use cases across provider‑to‑provider, provider‑to‑public‑health, and provider‑to‑consumer exchanges, aiming to curb...
Graph databases are emerging as essential infrastructure for enterprise AI, offering a way to map relationships that reduces hallucinations, improves explainability, and enforces data governance. Neo4j’s CEO Emil Eifrem highlights that knowledge graphs give LLMs transparent access to corporate data,...