Havas Media Network North America has hired Sharona Sankar‑King as chief data and product officer to steer its proprietary AI platform, Converged.AI, and the broader analytics practice. Sankar‑King arrives from Harte Hanks after more than 25 years in agencies, consultancies and marketing services, including senior roles at Bain, BBDO and GroupM. She will oversee a platform that connects 23,000 employees and powers tools like the no‑code AVA workflow builder. Her mandate is to scale intelligence‑led solutions and address fragmented data and workflow challenges.

The Modern Analytics for Roadway Safety (MARS) Coalition is urging Congress to modernise federal road safety programs by adopting AI, telematics and predictive analytics. These technologies allow agencies to spot crash risks before they materialise, moving from reactive to preventive...

Capgemini has joined OpenAI’s newly launched Frontier Alliance as a founding partner, creating a dedicated delivery function to scale AI agents for enterprises. The firm will deploy OpenAI‑certified professionals to tackle data readiness, integration, operating‑model design and governance challenges. Capgemini...

The article shows how to locate and list duplicate rows in a SQL Server table using a Common Table Expression (CTE) that groups all columns and counts occurrences. It presents two queries: one that returns only unique rows (order_count = 1) and...
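The grouping pattern the article describes can be sketched outside SQL Server; below is a minimal version using Python's sqlite3 (the `orders` table and its columns are hypothetical stand-ins, and the column name `order_count` is borrowed from the summary). The CTE groups on every column, counts occurrences, and the filter flips between duplicates and unique rows:

```python
import sqlite3

# Hypothetical table with one duplicated row.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, product TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [("alice", "widget", 2), ("alice", "widget", 2), ("bob", "gadget", 1)],
)

# CTE that groups on all columns and counts occurrences.
sql = """
WITH counted AS (
    SELECT customer, product, qty, COUNT(*) AS order_count
    FROM orders
    GROUP BY customer, product, qty
)
SELECT customer, product, qty, order_count
FROM counted
WHERE order_count > 1   -- use order_count = 1 to list only the unique rows
"""
dupes = conn.execute(sql).fetchall()
print(dupes)  # [('alice', 'widget', 2, 2)]
```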
PostgreSQL administrators frequently encounter zombie sessions—backend processes that remain active or idle in transaction after a client vanishes. Linux’s default TCP keepalive interval of two hours lets these dead connections retain locks and block vacuum, inflating the process list. The...
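A common mitigation, assuming you control the server configuration, is to tighten PostgreSQL's own keepalive settings rather than relying on the kernel default; the values below are illustrative, not a recommendation from the article:

```
# postgresql.conf — probe idle connections after 60 s instead of the
# kernel's 2-hour default (values are illustrative)
tcp_keepalives_idle = 60        # seconds before the first keepalive probe
tcp_keepalives_interval = 10    # seconds between unanswered probes
tcp_keepalives_count = 5        # failed probes before the connection is dropped

# belt and braces: terminate sessions stuck idle inside an open transaction
idle_in_transaction_session_timeout = '10min'
```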

The OECD’s 2025 Digital Government Index (DGI) places South Korea at the top with a 0.95 composite score, followed by Australia (0.88) and Portugal (0.86). Korea is the only nation to break the 0.9 threshold across all six assessment categories,...

Gartner released its inaugural Magic Quadrant for Decision Intelligence Platforms, signaling a shift from data‑driven to decision‑centric strategies. The report highlights legacy players like FICO alongside newer pro‑code solutions such as Quantexa, and notes that generative AI integration remains early....
India’s National Data and Analytics Platform (NDAP) will undergo a major revamp as NITI Aayog seeks a private‑sector partner to redesign, operate and hand over the system. The upgrade aims to handle vastly larger data volumes, add advanced analytics and...

Location intelligence is moving from a background reporting tool to a strategic asset as businesses combine geographic data with operational metrics. By layering spatial context onto demand, infrastructure and behavior datasets, firms uncover patterns that traditional analytics miss. AI and...
Data analytics is reshaping risk assessment from a reactive practice into a predictive science across finance, insurance, healthcare, and transportation. Predictive modeling, machine learning, and real‑time dashboards now enable firms to forecast exposure, micro‑segment customers, and allocate capital with greater confidence....

Macquarie Asset Management’s Asia‑Pacific Infrastructure Fund 4 has teamed with South Korean IT firm Gabia and its network subsidiary KINX to launch a $420 million hyperscale data‑center venture. The joint venture will initially build a 40 MW facility in Ansan, near Seoul, and aims to...

Google, via shell company Sharka LLC, filed to build a fifth data center on its Midlothian, Texas campus. The $880 million project will span 288,000 sq ft and is slated for completion by February 24, 2027. This addition follows a $100 million fourth building announced in...

The article compares Tonic Structural and Informatica for test data management, highlighting that both generate privacy‑safe data but differ in deployment models and feature focus. Informatica is shifting to a cloud‑first strategy after its acquisition by Salesforce, limiting on‑premises options, while...

Coforge has launched Data Cosmos, an AI‑enabled, cloud‑native data engineering and analytics platform designed to unify fragmented enterprise data. The solution is organized into five portfolios—Supernova, Nebula, Hypernova, Pulsar, and Quasar—that address modernization, governance, DataOps, and GenAI adoption across the...

The Public Accounts Committee has labeled the National Savings and Investments (NS&I) digital modernisation a “full‑spectrum disaster” after four years of a £3 bn programme that lacks an integrated plan, has seen costs triple and deadlines disappear. Parliament found the project...

The aviation sector is moving from isolated legacy systems to open‑architecture platforms that enable real‑time data sharing among air traffic control, airlines, and airports. Searidge Technologies, a NATS subsidiary, showcased its Chorus platform powering tools like Intelligent Stand Manager, which...
LiveRamp announced that third‑party AI agents can now plug directly into its data collaboration platform, removing the need for custom API calls. The integration enables agents to automate audience planning, segmentation, measurement and to interact with partner and proprietary agents....

The MarkTechPost tutorial walks through building a production‑style analytics and machine‑learning pipeline with Vaex on a synthetic 2 million‑row dataset. It showcases lazy feature engineering, approximate city‑level aggregations, and seamless integration with scikit‑learn via Vaex‑ML. The guide also demonstrates model training,...
Do it Best Group has launched Retail Pulse, a data‑driven platform that transforms independent hardware dealers’ POS and purchasing data into clear, actionable insights. By aggregating more than 1,000 member datasets, the tool creates tailored peer groups and highlights opportunities...

Infosys announced the completion of a large‑scale data modernization program for CSX Corporation, deploying its AI‑first Topaz platform built on Microsoft Fabric and Purview. The effort consolidated CSX’s fragmented data landscape into a unified cloud‑native environment, creating over 170 data...

Snowflake expanded its Cortex Code CLI to run in local environments, enabling AI‑assisted coding across dbt, Apache Airflow and other non‑Snowflake data sources under a subscription model.

London‑based Cristie Software introduced FSBlocker, a lightweight kernel driver that locks down files...

The DBTA webinar highlighted that 85% of subscribers plan to modernize data platforms by 2025, driven by the rapid rise of GenAI and large language models. Vendors such as Informatica, Dataiku, Qlik and CData outlined a shift toward modular, AI‑driven...

JPMorgan’s global head of credit trading, Sanjay Jhamna, says generative AI will overhaul credit trading by efficiently processing the asset class’s massive unstructured data. He described credit markets as the last frontier for automation, noting that conventional AI models have...

The article demystifies database keys, contrasting natural keys—business‑meaning values—with surrogate keys that are system‑generated identifiers. It outlines why surrogates are favored for stability, compactness, and predictable performance, while also noting scenarios where natural keys or composite junction keys are preferable....
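The trade-off can be sketched in a few lines; the snippet below uses SQLite via Python, with a hypothetical customers table in which the surrogate key stays stable while the natural value (an email address) remains a business constraint that is free to change:

```python
import sqlite3

# Hypothetical table contrasting the two key styles.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,   -- surrogate: stable, compact, meaningless
        email       TEXT NOT NULL UNIQUE,  -- natural: carries meaning, may change
        name        TEXT NOT NULL
    )
""")
conn.execute("INSERT INTO customers (email, name) VALUES ('a@example.com', 'Ada')")

# When the business value changes, the surrogate key (and every foreign key
# pointing at it) is unaffected:
conn.execute("UPDATE customers SET email = 'ada@example.com' WHERE customer_id = 1")
row = conn.execute("SELECT customer_id, email FROM customers").fetchone()
print(row)  # (1, 'ada@example.com')
```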

Yonyou released its Large Ontology Model (LOM) on February 24, a 4‑billion‑parameter AI that shifts enterprise data from static tables to a dynamic knowledge‑graph architecture. The model automates multi‑source ontology construction and delivers multi‑hop reasoning across procurement, production, sales and...

Druva has introduced Dru MetaGraph, a graph‑database layer that stores backup metadata as interconnected nodes, enabling AI agents to answer security and compliance questions with real‑time context. The approach stems from three drivers: security queries are fundamentally relationship‑based, customers need instant,...
The buyer’s guide evaluates the five dominant cloud data platforms—Databricks, Snowflake, Amazon Redshift, Google BigQuery, and Microsoft Fabric—highlighting their architectures, AI integrations, deployment models, and pricing structures. Databricks champions the lakehouse model with generative AI and open formats, while Snowflake...

Amazon Web Services’ ME‑CENTRAL‑1 region in the United Arab Emirates experienced an Availability Zone outage after unidentified objects struck the data center, igniting a fire and prompting emergency power shutdown. The incident coincided with a wave of Iranian missile and...

Big data is reshaping real estate by giving developers, agents, and investors real‑time demographic, economic, and environmental insights. Over 80% of agents now use AI‑driven tools, and predictive analytics enable precise scenario modeling for pricing, density, and amenities. The technology...

Databricks launched Zerobus Ingest, a fully managed serverless streaming service that moves data directly into Delta Lake tables. The platform streams data from sources such as manufacturing systems, financial trading apps, IoT devices, and cybersecurity tools. It promises sub‑five‑second latency,...
The article outlines how Azure Databricks and Azure Machine Learning can be tightly integrated to create a unified intelligence pipeline. Databricks handles large‑scale data ingestion, cleaning, and feature engineering using Spark and Delta Lake, while Azure ML supplies model versioning,...
At last year’s CIO Summit in Mumbai, senior leaders from banking, fintech, telecom and manufacturing debated the growing risk profile of open‑source databases, with PostgreSQL emerging as the focal point. The conversation has moved from pure performance to trust, encompassing...

Emerald Intelligence has introduced Embedded Analytics, a new SaaS feature that provides real‑time, macro‑level dashboards for the licensed cannabis and hemp market. The initial release includes four interactive dashboards covering state sales, company leaderboards, product sales, and store status, all...

Accurate B2B data appending is a strategic lever that drives sales and marketing performance. Companies that rely on internal teams often face technical, resource, and compliance hurdles, leading to stale or incomplete records. Partnering with specialized data‑append providers delivers fresh,...

Companies face mounting sustainability regulations and consumer scrutiny, yet their legacy supply‑chain systems hold fragmented, inconsistent product data. The article outlines five actions—gaining product visibility, feeding tools with clean inputs, extending traceability beyond distribution, building compliance‑ready data infrastructure, and treating...

Germany’s Mobility Data Space (MDS) and the pan‑European Data for Road Safety (DFRS) consortium have signed an agreement to exchange safety‑related traffic data from connected vehicles across the EU. The partnership enables near‑real‑time sharing of sensor‑derived incident information, supporting the...
The article argues that AI readiness starts with robust SQL Server schema design, not with machine‑learning models. It highlights that stable, non‑recycled primary keys, preserved historical records, and clear audit columns are essential for future feature engineering. By separating raw...
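A minimal sketch of that checklist, using a hypothetical shipments table (SQLite via Python here for brevity; the article targets SQL Server): a never-recycled surrogate key, audit columns, and derived features stored separately so raw history is preserved:

```python
import sqlite3

# Hypothetical AI-ready schema: stable keys, audit trail, raw vs derived split.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE shipments (
        shipment_id   INTEGER PRIMARY KEY,          -- stable, never reused
        raw_weight_kg REAL NOT NULL,                -- raw input, preserved as-is
        created_at    TEXT NOT NULL DEFAULT (datetime('now')),
        updated_at    TEXT NOT NULL DEFAULT (datetime('now')),
        updated_by    TEXT NOT NULL DEFAULT 'etl'   -- audit column
    );
    -- Derived features live in their own table, so raw history survives
    -- recomputation and future feature engineering can replay it.
    CREATE TABLE shipment_features (
        shipment_id INTEGER NOT NULL REFERENCES shipments(shipment_id),
        feature     TEXT NOT NULL,
        value       REAL NOT NULL,
        computed_at TEXT NOT NULL DEFAULT (datetime('now'))
    );
""")
conn.execute("INSERT INTO shipments (raw_weight_kg) VALUES (12.5)")
row = conn.execute(
    "SELECT shipment_id, raw_weight_kg, updated_by FROM shipments"
).fetchone()
print(row)  # (1, 12.5, 'etl')
```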

PostgreSQL’s query planner relies on catalog statistics from pg_class and pg_statistic to estimate costs. When these statistics become stale—due to bulk loads, schema changes, or insufficient vacuum—the planner can choose inefficient plans, turning millisecond queries into minute‑long ones. The article explains...
A recent benchmark shows that standard Python UDFs in PySpark dramatically slow pipelines because each row must be serialized to a Python worker. Using Pandas (vectorized) UDFs cuts execution time roughly fourfold by leveraging Apache Arrow’s columnar transfer. Native...
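The gap the benchmark measures can be illustrated outside Spark with a plain-pandas analog (this is not the article's benchmark): the first form crosses into Python once per row, much as a standard Python UDF does, while the second is a single columnar operation of the kind Arrow's transfer makes possible for Pandas UDFs:

```python
import numpy as np
import pandas as pd

# Synthetic column of 100k values.
df = pd.DataFrame({"amount": np.random.default_rng(0).random(100_000)})

# Row-at-a-time: every value is handed to a Python function individually,
# analogous to a plain Python UDF's per-row serialization cost.
slow = df["amount"].apply(lambda x: x * 1.2 + 1.0)

# Vectorized: one columnar operation over the whole Series,
# analogous to what a Pandas UDF receives via Arrow.
fast = df["amount"] * 1.2 + 1.0

# Identical results, very different per-row overhead.
assert np.allclose(slow, fast)
```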

The Dunkirk Port Authority has opened a 21‑hectare brownfield site for a potential AI‑focused data center, offering developers a power connection ranging from 400 MW to 700 MW. Power will be supplied by RTE from the nearby Flanders Maritimes substation, with a...

An upcoming research initiative will evaluate digital‑twin technology for data centers, aiming to identify high‑ROI use cases that surpass basic spreadsheet analysis. The study will assess available solutions, pinpoint scenarios—such as infrastructure vendor selection—that deliver quick, measurable value, and define...

Bindplane announced native destinations for the VictoriaMetrics ecosystem, allowing users to route OpenTelemetry metrics, traces, and logs directly to VictoriaMetrics, VictoriaTraces, and VictoriaLogs. The integration provides vendor‑neutral, OpenTelemetry‑native pipelines that eliminate manual exporter configuration and mitigate collector drift. It also...

Confluent Intelligence has introduced Streaming Agents, built on Google’s Agent2Agent protocol, to enable AI agents to share real‑time context and collaborate across platforms. The preview feature connects data sources such as BigQuery, Databricks, Snowflake and LangChain to third‑party systems like...
Tomas Vondra revisits PostgreSQL's long‑standing default of random_page_cost = 4.0, showing that modern SSDs make random I/O far more expensive than the parameter suggests. By timing sequential and index scans on a 4.4 GB table, he derives a random_page_cost of roughly 25‑35 on...
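The arithmetic behind such a derivation is straightforward: time a sequential scan and an index scan, divide each by the pages it touched, and take the ratio. The numbers below are made up for illustration, not Vondra's measurements:

```python
# Illustrative timings (made-up values, not the blog post's data).
seq_scan_s, seq_pages = 4.0, 550_000    # sequential scan: duration, pages read
idx_scan_s, rnd_pages = 24.0, 110_000   # index scan: duration, random pages read

per_seq_page = seq_scan_s / seq_pages   # cost of one sequential page read
per_rnd_page = idx_scan_s / rnd_pages   # cost of one random page read

# random_page_cost is expressed relative to seq_page_cost = 1.0
derived = per_rnd_page / per_seq_page
print(round(derived, 1))  # 30.0 with these illustrative numbers
```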

Security firm Truffle Security revealed that publicly exposed Google API keys can be upgraded to full‑access Gemini credentials, enabling data exfiltration from any organization using them. A November scan uncovered 2,863 such keys, affecting major banks, security vendors, and even...

An AI proof of concept (POC) is a focused, short‑term project that validates technical feasibility and business value before full‑scale investment. Costs vary widely, driven primarily by data readiness, problem complexity, integration needs, and infrastructure choices, with data preparation often...

The tutorial walks through building an elastic vector‑database simulator that uses consistent hashing with virtual nodes to shard embeddings across distributed storage. It includes a live, interactive ring visualization that shows how adding or removing nodes only reshuffles a tiny...
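The core mechanism can be sketched in a few lines (an illustrative toy, not the tutorial's simulator): each node is hashed onto the ring many times as virtual nodes, and adding a fourth node moves only the keys that now hash to it:

```python
import bisect
import hashlib

def _h(key: str) -> int:
    """Stable hash onto the ring's key space."""
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class Ring:
    """Minimal consistent-hash ring with virtual nodes."""
    def __init__(self, nodes, vnodes=64):
        # Each physical node appears `vnodes` times on the ring.
        self._ring = sorted((_h(f"{n}#{i}"), n) for n in nodes for i in range(vnodes))
        self._keys = [h for h, _ in self._ring]

    def lookup(self, key: str) -> str:
        # First vnode clockwise from the key's hash (wrapping around).
        i = bisect.bisect(self._keys, _h(key)) % len(self._ring)
        return self._ring[i][1]

embeddings = [f"vec-{i}" for i in range(10_000)]
before = Ring(["a", "b", "c"])
after = Ring(["a", "b", "c", "d"])  # one node added

moved = sum(before.lookup(k) != after.lookup(k) for k in embeddings)
print(f"{moved / len(embeddings):.0%} of keys reshuffled")  # roughly 1/4
```

Every key that moves lands on the new node "d"; the rest stay put, which is the property that makes rescaling cheap.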

Vast Data and Nvidia have launched the CNode‑X, a GPU‑powered server that embeds the Vast Data AI Operating System directly onto Nvidia hardware. The integrated solution is optimized for AI pipelines, high‑performance analytics, vector search, retrieval‑augmented generation and agentic workloads....

The United States and the European Union are negotiating the Enhanced Border Security Partnership (EBSP), which would grant visa‑free travel to EU citizens in exchange for access to European biometric databases. The latest draft does not explicitly prohibit the use...
Percona released Operator for MongoDB version 1.22.0, adding automatic Persistent Volume Claim resizing, HashiCorp Vault integration for system user credentials, and native service‑mesh compatibility via the appProtocol field. The update also expands backup and restore capabilities, including replica‑set name remapping,...