Strategic Whitepaper

CALIFORNIA'S AI OPPORTUNITY

Interoperability, Enterprise Planning, and the Logistics Network Government Didn't Know It Had
Jason Shearer, Chief Architect & AI Lead, U.S. Public Services
Business Transformation & Architecture, North America
CCST AI Applications and Innovation Showcase
March 4, 2026

Executive Summary

California's public agencies face a convergence of mandates that share a common dependency: interoperability.

GASB 104, effective this fiscal year, requires every state and local government to separately disclose categories of capital assets it has never had to break out before. GASB 96 already put every IT subscription on the balance sheet as a right-to-use asset. The federal SAMOSA Act, which passed the House in December 2025, mandates comprehensive software asset inventories across all agencies. NIEM, the National Information Exchange Model, is moving from policy preference toward procurement requirement for federally funded programs in emergency management, homeland security, and defense. California's own Envision 2026 commits the state to interoperability, digital identity, and innovation procurement.

None of these mandates can be met in isolation. Asset disclosure requires cost allocation. Cost allocation requires integrated planning. Planning requires data that moves across organizational boundaries. Data exchange requires identity verification. Identity verification requires open standards. Each mandate pulls on the same thread: systems, organizations, and data must work together across boundaries of scale, maturity, and jurisdiction.

This paper makes a specific claim: the infrastructure to meet these mandates largely exists on a platform California agencies are already running. What it has not done is connect. The paper is organized around a progression that builds from connection to simulation, with each step serving as an independent pilot:

1. Connect: Business Network
2. Equip: SAP Build + Mobile
3. Sense: Edge AI + Cameras
4. Plan: xP&A + Cost Transparency
5. Twin: Digital Twin
6. Simulate: Omniverse
Each step is a pilot. Each pilot delivers value independently. Each builds toward a digital twin of the enterprise: a working model of how California's public infrastructure actually operates, where decisions can be tested before they are made.

Most California state and local government customers already run an enterprise platform for finance, HR, and procurement. Many already use Ariba. What they do not know is that they sit adjacent to a business network of over 15 million trading partners: utilities, food distributors, logistics providers, and warehouse operators. They can participate in logistics orchestration, emergency supply staging, and cross-agency coordination without procuring dedicated systems. The network is there. It has not been turned on.

The same platform produces the cost transparency that GASB 104 demands, the NIEM-conformant data exchange that federal coordination requires, and the verifiable identity infrastructure that makes data sharing trustworthy without a central broker. The architecture is open source, built on standards running in production across European data spaces, and designed so that no single vendor controls the infrastructure. When a legislator asks "does this lock us in?", the answer is documented across a decade of open source contributions from Cloud Foundry to Eclipse Tractus-X.

Seven pilots are proposed. Each can start within six months. Each addresses a real California problem: utility coordination during wildfires, school food logistics, donation management and workforce development, integrated public planning, edge AI for field service, cost transparency for legislative accountability, and a federated data space for small utilities that today cannot share critical data with the agencies responsible for keeping their communities safe.

This is not a proposal to buy new software. It is a map showing California how to meet the mandates it already faces, solve the coordination problems it already knows about, and build a digital twin of public infrastructure that makes the next crisis cheaper, faster, and less deadly to manage.

Chapter 1: The Interoperability Imperative

The Universal Pattern

Across infrastructure, housing, and public services, the same structural problem repeats. Large organizations operate advanced digital systems. Smaller organizations, which often serve the most vulnerable populations, operate with minimal or no digital capability. During emergencies, this gap becomes a coordination failure.

This is not specific to California. It appears wherever infrastructure responsibility is distributed across organizations of vastly different size and digital maturity. But California, with its combination of wildfire exposure, seismic risk, housing pressure, and utility complexity, faces this pattern at a scale few other jurisdictions match.

California's Moment

Three converging priorities define the current policy environment: wildfire resilience, housing crisis response, and digital government modernization.

Following the 2025 Los Angeles fires, the state deployed its first statewide LiDAR mapping effort, expanded the world's largest aerial firefighting fleet, and nearly doubled CAL FIRE's protection budget from $2 billion to $3.8 billion. In February 2026, a new CPUC President was named with an explicit mandate for wildfire safety spending, grid hardening, and utility accountability.

Beneath this investment lies a structural coordination gap. During the fires, emergency coordination entities discovered that critical spatial data about infrastructure assets could not be shared between small utilities and large providers. The systems were never designed for cross-organizational interoperability. Investment at the top had not reached the coordination layer at the bottom.

The housing challenge follows the same pattern. More than 540 jurisdictions submit Annual Progress Reports through fragmented, incompatible systems. The citizen services challenge follows it too: one in five Californians lack reliable internet access, while digital services assume broadband connectivity.

In each case, the barrier is not a lack of technology at any single organization. It is the absence of an interoperability layer that allows technology to work across organizational boundaries.

California's AI Legislative Leadership

California has established itself as the nation's leading AI regulator. In the first half of the 2025–2026 session, lawmakers introduced more than 33 AI and privacy bills.

Legislation | Description | Significance
SB 53 | Transparency in Frontier AI Act | First state law regulating frontier AI model developers
AB 316 | AI Civil Liability | Bars AI autonomy as a defense in civil actions
AB 853 | AI Transparency Act (Amended) | Expands content provenance requirements; effective August 2026

The 2026 session is expected to advance bills targeting employment AI systems, biometric data protections, and automated decision-making in high-stakes contexts.

This Has Been Solved Before

California is not the first jurisdiction to face this challenge. Europe has built, tested, and deployed a working model. Understanding the pattern matters because the same architecture applies directly to California's problem.

The Catena-X initiative, founded by BMW, Deutsche Telekom, Bosch, SAP, Siemens, and others, created the first large-scale federated data space for industry. Its open-source implementation, Eclipse Tractus-X, enables secure, sovereign data exchange between organizations that do not share common systems. Three principles matter here:

Data sovereignty: Each participant retains control of their data. Small organizations share only what is necessary for specific coordination scenarios, nothing more. Participants verify each other's identity through self-sovereign credentials rather than a central broker (Chapter 6 describes the technical mechanism).

Federated integration: The Eclipse Dataspace Connector enables exchange regardless of internal platform. An organization on paper can participate alongside one running advanced GIS.

Open standards: Built on the Gaia-X trust framework and International Data Spaces Association principles. No single vendor controls the infrastructure.
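The self-sovereign identity pattern behind these principles can be made concrete with a minimal sketch. The snippet below shows, under stated assumptions, how a did:web identifier is derived from a domain and how claims are wrapped in a minimal credential envelope following the W3C Verifiable Credentials data model. The DIDs and the "dataspaceRole" claim are hypothetical examples; real Decentralized Claims Protocol exchanges add cryptographic proofs and presentation flows not shown here.

```python
import json

# Illustrative sketch: a did:web identifier and a minimal (unsigned)
# verifiable credential envelope. Field names follow the W3C Verifiable
# Credentials data model; issuer and subject DIDs are hypothetical.

def did_web(domain: str) -> str:
    """Derive a did:web identifier from a domain name."""
    return "did:web:" + domain.replace("/", ":")

def issue_credential(issuer_did: str, subject_did: str, claims: dict) -> dict:
    """Wrap claims in a minimal VC envelope, for illustration only."""
    return {
        "@context": ["https://www.w3.org/2018/credentials/v1"],
        "type": ["VerifiableCredential"],
        "issuer": issuer_did,
        "credentialSubject": {"id": subject_did, **claims},
    }

issuer = did_web("utility.example.gov")       # hypothetical large utility
member = did_web("small-coop.example.org")    # hypothetical small co-op
vc = issue_credential(issuer, member, {"dataspaceRole": "participant"})
print(json.dumps(vc, indent=2))
```

The point of the pattern is in the last line: the small co-op's right to participate is a portable, verifiable claim, not a row in a central broker's database.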

In parallel, the European Commission's Data Space for Smart and Sustainable Cities and Communities (DS4SSCC) is deploying federated data spaces for municipal governments. The initiative is led by the Open & Agile Smart Cities network (OASC), Eurocities, and the FIWARE Foundation, which provides the operational data models and NGSI-LD API schemas that define how mobility, energy, environment, and public safety data is structured and exchanged within these spaces. FIWARE Smart Data Models are to the data space what NIEM vocabularies are to U.S. government exchange: the shared language that makes interoperability work.
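To show what "shared language" means in practice, the sketch below builds an NGSI-LD-style entity of the kind FIWARE Smart Data Models standardize: typed Property attributes plus a GeoProperty for location. The entity type, attribute names, and URN are illustrative placeholders, not a published Smart Data Model.

```python
import json

# Hedged sketch: an NGSI-LD-style entity for a road-segment observation,
# loosely following the FIWARE convention of typed Property/GeoProperty
# attributes. The type, attribute names, and URN are illustrative.

def ngsi_entity(entity_id, entity_type, props, location=None):
    entity = {"id": entity_id, "type": entity_type}
    for name, value in props.items():
        entity[name] = {"type": "Property", "value": value}
    if location:
        entity["location"] = {
            "type": "GeoProperty",
            "value": {"type": "Point", "coordinates": list(location)},
        }
    return entity

segment = ngsi_entity(
    "urn:ngsi-ld:RoadSegment:CA-101-0042",   # hypothetical identifier
    "RoadSegment",
    {"surfaceCondition": "pothole-reported", "severity": 3},
    location=(-122.41, 37.77),
)
print(json.dumps(segment, indent=2))
```

Because every participant structures a road segment, an outage, or an air-quality reading the same way, a consumer can query data from a dozen cities without a dozen custom integrations.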

DS4SSCC-DEP (the deployment phase) runs through September 2026 with an €18 million budget. Three rounds of open calls have selected pilots across Europe. The sectors map directly to California's challenges: cross-border mobility and pollution management between Szeged and Timișoara; positive energy districts in Spain and Bulgaria, where neighborhoods produce more energy than they consume; participatory urban planning across Eindhoven, Oulu, and Álava. Each pilot validates the same blueprint: resource-constrained local governments participating in data ecosystems alongside well-resourced organizations, without adopting expensive new platforms.

The U.S. Parallel: NIEM and the Coming Mandate

The United States has its own interoperability standard, and its trajectory toward mandate is accelerating.

The National Information Exchange Model (NIEM) defines a common vocabulary and exchange format for sharing data across jurisdictions and agencies. It originated in 2005 as a partnership between the Departments of Justice and Homeland Security. It now spans DoJ, DHS, DoD, and HHS. In 2013, DoD adopted a "NIEM first" policy requiring NIEM-conformant data exchange for defense information sharing. NIEM is cited as a key enabler in the Joint All-Domain Command and Control (JADC2) Reference Architecture. In 2023, NIEM transitioned to NIEMOpen under OASIS, formalizing it as an international standard eligible for inclusion in procurement requirements worldwide.

NIEM already defines domains for emergency management, infrastructure protection, justice, human services, and military operations. Its Information Exchange Package Documentation (IEPD) format specifies how data elements are structured, validated, and exchanged between systems that have never communicated before. This is not aspirational. Over 7,000 DoD staff use NIEM-based systems through the Warfighter Mission Area Architecture Federation and Integration Portal. Law enforcement uses NIEM through LInX, one of the largest information sharing systems in the country. The Maritime Information Sharing Environment (MISE) uses NIEM across 12 federal and defense partners.

For California, the NIEM trajectory creates both an obligation and an opportunity.

The obligation: as NIEM compliance moves from policy preference to procurement requirement across federal programs, California agencies that receive federal emergency management funding, homeland security grants, or defense-related contracts will need NIEM-conformant data exchange. The state's utility coordination infrastructure, its emergency response data, and its cross-jurisdictional service delivery will all face this requirement.

The opportunity: California's largest utilities are already SAP customers. Southern California Edison, SDG&E, and the Department of Water Resources all run SAP for core operations. The data that needs to become NIEM-conformant (outage reports, asset conditions, crew availability, mutual aid requests) already exists in these systems. A data space architecture built on SAP's Data Space Integration and Decentralized Identity Verification can produce NIEM-conformant exchange packages from enterprise data that is already being captured. The small utilities that cannot afford dedicated systems get NIEM compliance through the shared data layer, the same way they get logistics coordination through the Business Network.
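The shape of that exchange can be sketched. The snippet below maps enterprise fields from an outage record into a namespaced XML element in the spirit of a NIEM IEPD: a defined namespace, a defined element vocabulary, validated structure. The namespace URI and element names here are invented placeholders, not a conformant NIEM schema; a real IEPD would reference the NIEM emergency management domain and validate against its XSDs.

```python
import xml.etree.ElementTree as ET

# Hedged sketch: shaping an outage report from enterprise fields into a
# namespaced XML exchange element in the spirit of a NIEM IEPD. The
# namespace URI and element names are illustrative placeholders, not a
# validated NIEM schema.

NS = "urn:example:em:outage:1.0"   # hypothetical exchange namespace
ET.register_namespace("em", NS)

def outage_report(record: dict) -> ET.Element:
    root = ET.Element(f"{{{NS}}}OutageReport")
    for field in ("UtilityName", "AssetID", "Condition", "CrewAvailable"):
        child = ET.SubElement(root, f"{{{NS}}}{field}")
        child.text = str(record[field])
    return root

record = {"UtilityName": "Example Coop", "AssetID": "XFMR-118",
          "Condition": "damaged", "CrewAvailable": "false"}
xml_text = ET.tostring(outage_report(record), encoding="unicode")
print(xml_text)
```

The work of compliance is not inventing this structure per agency; it is mapping fields that already exist in enterprise systems onto the standard vocabulary once, at the shared data layer.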

Tractus-X and NIEM solve the same problem from different starting points. Europe started with industry consortia and built toward government. The U.S. started with government and is building toward open standards. The architectural patterns converge: federated data exchange, sovereign identity verification, standard vocabularies, no central broker. SAP's position in both ecosystems is the bridge.

Chapter 2: The Open Source Infrastructure You Didn't Know Existed

A Decade of Building

A legislator reviewing this paper will ask a reasonable question: does this lock California into a single vendor? The answer matters because the state's Envision 2026 strategy explicitly requires open standards and innovation procurement. The answer is also documented across a decade of open source contributions that form the foundation of every interoperability capability described here.

Year | Initiative | Significance
2014–2018 | Cloud Foundry Foundation | Founding platinum member. Open-source cloud application platform now used across government and enterprise.
2018 | Kyma Runtime | Open-source Kubernetes-based extensibility platform. Donated to community. Now the foundation for SAP BTP extensions.
2019 | Eclipse Foundation membership | Open governance for enterprise software.
2021 | Catena-X / Tractus-X | Co-founded the automotive data space. Eclipse Tractus-X released as open-source reference implementation.
2022–2024 | ABAP Cloud SDK, CAP Framework | Open developer tooling on GitHub for building enterprise applications.
2025 | Gaia-X and IPCEI-CIS alignment | Participation in European sovereign cloud and digital identity infrastructure.
2023–2026 | DS4SSCC / FIWARE Foundation | Co-participant in the Data Space for Smart and Sustainable Cities and Communities. FIWARE provides NGSI-LD API schemas and Smart Data Models for cross-sector urban data exchange.
2025 | Decentralized Identity Verification (DIV) | BTP service for self-sovereign identity: verifiable credentials, DID:Web, Decentralized Claims Protocol. In production at Catena-X.

This is a company that has spent a decade building connective tissue between enterprise systems and open ecosystems. That connective tissue is what California's interoperability challenge requires.

Sovereign Cloud and Data Sovereignty

For government customers, where data lives and who controls it is not negotiable. SAP's sovereign cloud positioning includes dedicated government cloud instances, FedRAMP authorization pathways, and architectural alignment with Gaia-X data sovereignty principles.

The federated data space model described in Chapter 1 does not require centralization. Each participating organization retains sovereignty over its data, sharing only what is needed under explicit, auditable agreements. This model supports local autonomy: jurisdictions retain control of their systems while participating in shared standards.

Why This Matters for SLED

State and local government customers interact with SAP in the ERP room: finance, HR, procurement. They have not been shown the open source integration layer, the data space architecture, or the business network. This paper aims to change that. Not by proposing new procurement, but by showing what is already adjacent to what they own.

Chapter 3: The Business Network as Public Infrastructure

The Logistics Partner Government Didn't Know It Had

When FEMA coordinates disaster response with a California county, the data exchange runs on phone calls, email, and PDF attachments. When CalOES stages emergency supplies, visibility into warehouse capacity and carrier availability depends on personal relationships, not system integration. The interoperability mandates described in Chapter 1 exist precisely because this coordination layer does not.

This chapter describes the coordination layer. It already exists. It has not been turned on for government.

SAP Business Network connects over 15 million trading partners across logistics, procurement, and asset management. When a California city uses Ariba for procurement, it is already connected to this network. What it likely does not know is that the same network supports:

Freight Collaboration: Visibility into shipments, carrier performance, and transportation capacity across logistics providers. The city does not need to own or operate a transportation management system.

Asset Collaboration: Shared visibility into asset condition, maintenance schedules, and lifecycle data across organizations that co-manage infrastructure: utilities, public works, contractors.

Supply Chain Collaboration: Real-time coordination with suppliers, distributors, and third-party logistics providers for everything from emergency supplies to school food distribution.

SLED customers do not need to buy TM (Transportation Management) or EWM (Extended Warehouse Management) to participate in logistics orchestration. They can connect through the Business Network to utilities, food distributors, 3PLs, and warehouse operators who are already there. The coordination layer exists. It has not been activated for government use cases.

Example: A county emergency management office coordinates supply staging after a wildfire. Today, the process runs on phone calls and spreadsheets. Through the Business Network, the office sees available warehouse capacity from regional 3PLs, routes freight through carriers already on the network, and tracks delivery in real time. No new logistics system is required. The county connects through its existing Ariba instance.
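The coordination logic in that example is simple enough to sketch. Under the assumption that warehouse operators publish remaining pallet capacity to the shared network, staging reduces to an assignment problem; the greedy first-fit below is a minimal illustration, not a Business Network API, and every name and figure in it is invented.

```python
# Hedged sketch: matching staged supply lots to warehouse capacity
# advertised on a shared network. Greedy first-fit, largest lots first.
# All names and figures are illustrative, not a Business Network API.

def stage_supplies(lots, warehouses):
    """Assign each (name, pallets) lot to the first warehouse with room."""
    capacity = dict(warehouses)              # remaining pallet positions
    plan, unplaced = [], []
    for name, pallets in sorted(lots, key=lambda l: -l[1]):
        for wh in capacity:
            if capacity[wh] >= pallets:
                capacity[wh] -= pallets
                plan.append((name, wh))
                break
        else:
            unplaced.append(name)            # flag for manual escalation
    return plan, unplaced

lots = [("water", 40), ("cots", 25), ("meals", 60)]
warehouses = [("3PL-Fresno", 70), ("3PL-Stockton", 60)]
plan, unplaced = stage_supplies(lots, warehouses)
print(plan, unplaced)   # cots cannot be placed and is flagged
```

The value is not the algorithm, which any dispatcher runs in their head; it is that the capacity figures arrive through the network in real time instead of through phone calls.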

For organizations that do need physical warehouse capability (a regional distribution center, a school district food hub, a disaster relief staging area) a targeted EWM deployment provides full warehouse management. But many coordination use cases require only the network, not the full stack.

Extended Planning and Analysis (xP&A) for the Public Sector

xP&A replaces siloed planning with integrated planning. Instead of finance planning in one system, supply chain in another, and workforce in a third, organizations plan across all domains in a single environment. Financial forecasts, procurement spend, logistics capacity, asset maintenance, and workforce availability inform each other in real time.

In enterprise, xP&A has produced measurable results. In state and local government, the discipline barely exists. The ERP conversations in government are still about core finance and HCM. No one has walked SLED customers from their financial planning into integrated planning across logistics, assets, and services.

When a city sees freight collaboration data next to its financial plan, next to procurement spend, next to asset maintenance schedules, that is xP&A applied to public infrastructure. The planning becomes predictive. Budget cycles compress. Resource allocation improves. And the data feeding the plans comes from Business Network connections that are already available.

Planning Domain | Traditional SLED Approach | xP&A Integrated Approach
Financial Planning | Annual budget cycle in isolation | Continuous planning informed by logistics, procurement, and asset data
Procurement | Purchase orders disconnected from supply chain visibility | Procurement linked to supplier capacity and delivery forecasts
Logistics | Not managed; delegated to individual departments | Network-connected freight and supply chain visibility feeding the financial plan
Asset Management | Reactive maintenance tracked in spreadsheets | Predictive maintenance integrated with capital planning
Workforce | Headcount budgeting disconnected from service demand | Workforce planning shaped by service volumes, seasonal patterns, and field data

Cost Transparency: The Foundation xP&A Requires

Integrated planning without cost transparency is planning without a foundation. A city can connect its financial plan to procurement and logistics data, but if it cannot show where shared costs actually land, the planning produces numbers no one trusts.

Government budgets are full of pooled costs: facility overhead, IT infrastructure, shared administrative services, fleet maintenance. These costs are real. They are also, in most agencies, invisible at the program level. A legislator asks what it actually costs to deliver wildfire coordination, or school food distribution, or homeless outreach, fully burdened, including the infrastructure underneath. The honest answer today, in most jurisdictions, is: we do not know.

SAP Profitability and Performance Management (PaPM) is an allocation engine built for exactly this problem. It takes pooled costs and distributes them through multi-step, auditable allocation models: driver-based spreads, hierarchical waterfalls, weighted distributions, iterative cycles. In defense contracting, PaPM produces the indirect rate calculations (fringe, overhead, G&A) that survive a DCAA audit. In state government, the same engine can produce the fully burdened cost of any service, program, or jurisdiction.

The allocation models are not simple division. PaPM handles sequential allocation steps where the output of one pool feeds the input of the next. It runs what-if simulations before posting actuals. It pulls from multiple data sources: S/4HANA for financial actuals, the Business Network for logistics costs, EPPM for project costs. The result is a fully attributed cost model where every dollar of shared infrastructure is assigned to the programs it supports, through logic that can be examined, questioned, and defended.
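A minimal sketch makes the sequential mechanics concrete. In the example below, a facility pool spreads by headcount, part of it lands on IT, and the enlarged IT pool then spreads by ticket volume, so every program ends with a fully burdened total that reconciles to the sum of all pools. The figures and drivers are invented; PaPM automates this pattern across many more steps with audit trails.

```python
# Hedged sketch of sequential, driver-based allocation in the style PaPM
# automates: one pool's allocated output feeds the next step. All figures
# and driver values are invented for illustration.

def allocate(pool_total, drivers):
    """Spread a cost pool across receivers in proportion to their drivers."""
    total = sum(drivers.values())
    return {k: pool_total * v / total for k, v in drivers.items()}

programs = {"wildfire": 500_000.0, "school_food": 300_000.0}  # direct costs

# Step 1: facility overhead spreads to IT and to programs by headcount.
facility = allocate(120_000, {"it": 10, "wildfire": 25, "school_food": 15})

# Step 2: the IT pool (own cost + its facility share) spreads by tickets.
it_pool = 200_000 + facility["it"]
it = allocate(it_pool, {"wildfire": 60, "school_food": 40})

# Fully burdened totals: direct cost + facility share + IT share.
burdened = {p: programs[p] + facility[p] + it[p] for p in programs}
print({p: round(v, 2) for p, v in burdened.items()})
```

Note the reconciliation property: the burdened totals sum to exactly the direct costs plus both pools, which is what makes the model defensible to an auditor.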

The interface is a CAP application built on SAP's open-source Cloud Application Programming model. CAP runs on BTP. The application surfaces PaPM's allocation results as a clean, queryable cost transparency dashboard. An agency head drills from a total budget down to allocated cost per service. A legislative budget analyst compares fully burdened delivery costs across programs. A program manager sees the real cost of the infrastructure underneath their operation, not just the direct charges on their ledger.

This connects the xP&A story to accountability. Integrated planning tells you where resources should go. Cost transparency tells you where they actually went, and what they actually cost. Together, they give government something it has rarely had: a planning system backed by auditable truth about the cost of delivering services.

For the CCST audience, cost transparency also addresses a political reality. Every dollar of state investment in AI, interoperability, or coordination infrastructure will face scrutiny. PaPM provides the mechanism to show, line by line, what that investment produced and what it cost. That is not a technical capability. That is a governance requirement.

The Compliance Pressure That Makes This Urgent

Two accounting standards are converging to make capital asset transparency a mandate, not a preference.

GASB 96 (effective 2022) reclassified subscription-based IT arrangements as right-to-use assets on the balance sheet. Every cloud subscription, every SaaS contract, every managed service agreement a state agency holds is now a capital asset that must be tracked, amortized, and disclosed. GASB 104 (effective for fiscal years beginning after June 15, 2025) goes further: it requires state and local governments to separately disclose these right-to-use assets from owned capital assets in their financial statement notes. Assets held for sale get their own disclosure requirements. The Governmental Accounting Standards Board is also developing new infrastructure asset reporting requirements that would mandate disclosure of assets more than 80% depreciated.
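The mechanics GASB 96 imposes can be illustrated with a simplified schedule: a multi-year subscription recognized as a right-to-use asset and amortized straight-line over its term. The figures are invented, and real schedules discount future payments and handle contract modifications; this sketch shows only the disclosure-relevant shape of the calculation.

```python
# Hedged illustration of GASB 96-style treatment: a subscription
# recognized as a right-to-use asset, amortized straight-line over the
# term. Figures invented; real schedules discount payments and handle
# modifications, which this sketch omits.

def amortization_schedule(asset_value: float, term_years: int):
    """Straight-line amortization; returns (year, expense, carrying)."""
    annual = asset_value / term_years
    carrying, schedule = asset_value, []
    for year in range(1, term_years + 1):
        carrying -= annual
        schedule.append((year, round(annual, 2), round(carrying, 2)))
    return schedule

# A hypothetical 5-year, $450,000 subscription arrangement.
for year, expense, carrying in amortization_schedule(450_000, 5):
    print(f"Year {year}: amortization {expense:>10,.2f}, "
          f"carrying value {carrying:>10,.2f}")
```

Multiply this by every SaaS contract an agency holds and the inventory problem SAMOSA targets becomes visible: each row above is a disclosure line someone must be able to produce on demand.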

At the federal level, the SAMOSA Act (Strengthening Agency Management and Oversight of Software Assets) passed the House in December 2025 and awaits Senate action. It mandates comprehensive software asset inventories for all federal agencies within 18 months of enactment, with independent assessments of license management practices. The Federal CIO has already directed agencies to begin inventorying their top software entitlements. A Senate committee estimated annual savings of $750 million from eliminating waste the inventories would reveal.

Federal mandates flow downhill. States that receive federal funding face audit requirements that align with federal standards. California agencies already subject to GASB 96 and 104 will face increasing pressure to demonstrate not just that they hold these assets, but what each asset costs to operate, maintain, and retire.

PaPM closes this loop. The asset master data already exists in SAP for agencies running S/4HANA. GASB 104 requires that assets be categorized and disclosed by type. PaPM allocates the shared costs (facility, IT infrastructure, administrative overhead) that make each asset category's total cost of ownership visible. The CAP dashboard surfaces the result in a format that satisfies both the accounting disclosure and the legislative analyst asking the harder question: what did we get for this money?

A state that deploys PaPM-based cost transparency is not just building a planning tool. It is building a compliance infrastructure that meets GASB disclosure requirements, prepares for SAMOSA-aligned state mandates, and produces the auditable cost evidence that legislators and auditors will increasingly demand.

Chapter 4: Edge Intelligence and the Physical World

Planning and coordination produce better decisions. But decisions happen in the field: on a utility pole, at a school loading dock, at a shelter intake desk. This chapter describes what happens when the interoperability layer reaches the people who do the work.

SAP Build: The Mobile App with AI That Integrates Everything

The interoperability story becomes tangible when a field worker picks up a phone.

SAP Build is a low-code/no-code platform for building mobile applications that integrate with the full enterprise stack: ERP, Business Network, AI services, and edge devices including cameras. For SLED customers, a single development environment produces apps for radically different field scenarios, all feeding data back into the same planning and coordination layer.

Utility field service: A crew member photographs a damaged transformer. The app captures the image, geolocation, and condition assessment. The work order routes through Field Service Management. The Business Network notifies the parts supplier. The planning layer adjusts the maintenance forecast.

Public works: A pothole detection app runs on a phone mounted to a dashboard. The camera photographs damage during routine driving. AI classifies severity. A work order is created. The planning layer prioritizes repair based on traffic volume, safety risk, and budget.

Homeless outreach: A case worker checks real-time shelter bed availability, routes a person to the nearest open bed, and logs the interaction. The data feeds service demand forecasting.

School food logistics: A cafeteria manager scans incoming deliveries. The app verifies quantities against purchase orders and flags discrepancies. The day's menu plan updates.

In every case, the app is built once in SAP Build, connects through standard APIs, and generates data that feeds planning. The field worker needs a phone and five minutes of training.
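The common pattern behind all four scenarios is a small structured payload plus simple routing rules. The sketch below illustrates that pattern under stated assumptions: the field names, severity scale, and routing thresholds are invented for illustration and are not an SAP Build or Field Service Management API.

```python
from dataclasses import dataclass, asdict

# Hedged sketch of the common pattern behind the field apps: the mobile
# client captures a small structured payload, and simple rules decide
# where it routes. Fields and thresholds are illustrative, not an SAP
# Build or Field Service Management API.

@dataclass
class FieldReport:
    asset_id: str
    category: str        # e.g. "transformer", "pothole", "delivery"
    severity: int        # 1 (minor) .. 5 (critical)
    lat: float
    lon: float

def route(report: FieldReport) -> str:
    if report.severity >= 4:
        return "dispatch-now"         # immediate work order
    if report.severity >= 2:
        return "planned-maintenance"  # enters the planning backlog
    return "log-only"                 # recorded for trend analysis

r = FieldReport("XFMR-118", "transformer", 4, 38.58, -121.49)
print(route(r), asdict(r))
```

Everything downstream, from the work order to the adjusted maintenance forecast, consumes this same payload; the field worker never sees any of it.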

Digital Manufacturing for Same-Day Logistics

SAP Digital Manufacturing was designed for factory floors: production schedules, quality checks, material flows, equipment management. But a school district central kitchen preparing 50,000 meals a day is a manufacturing operation. A disaster relief staging area sorting and routing supplies is a manufacturing operation. A donation center classifying and distributing contributed goods is a manufacturing operation.

The same platform that manages an automotive assembly line can manage food flow from a central kitchen to 200 school cafeterias. It schedules production runs, tracks ingredients through preparation, coordinates delivery routes through the Business Network, and confirms receipt at each school. Camera systems at the loading dock count pallets and verify shipments. Quality checks are logged. Exceptions flag in real time.

On-Device Vision: CliffordNet and California-Specific AI

The camera capabilities described above require vision AI that runs at the edge, on-device, without cloud latency or connectivity dependencies.

CliffordNet is a geometric-algebra-based neural architecture that uses Clifford Algebraic Networks (CANs) to encode spatial relationships through the geometric product rather than standard convolution. Small model variants deliver disproportionate accuracy per parameter, and the interaction mechanism runs at linear complexity with a sparse rolling attention pattern. On-device deployment is practical even on commodity hardware.
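The geometric product is easiest to see in the plane. For two vectors it yields a scalar part (the dot product, measuring alignment) plus a bivector part (the signed area, measuring orientation), so a single operation encodes both quantities a convolution kernel would have to learn separately. The sketch below shows the algebra only, in Cl(2,0); it is not the CliffordNet architecture.

```python
# Minimal illustration of the geometric product that Clifford-algebra
# networks build on, for two vectors in the plane (Cl(2,0)). One
# operation yields alignment (scalar) and orientation (bivector).
# This is the algebra only, not the CliffordNet architecture.

def geometric_product_2d(u, v):
    """Return (scalar, bivector) parts of the product of two 2D vectors."""
    scalar = u[0] * v[0] + u[1] * v[1]      # u . v
    bivector = u[0] * v[1] - u[1] * v[0]    # u ^ v  (e1e2 coefficient)
    return scalar, bivector

# Orthogonal unit vectors: no alignment, unit oriented area.
print(geometric_product_2d((1, 0), (0, 1)))   # (0, 1)
# Parallel vectors: pure alignment, zero area.
print(geometric_product_2d((2, 0), (3, 0)))   # (6, 0)
# Reversing the order flips the bivector sign: orientation is encoded.
print(geometric_product_2d((0, 1), (1, 0)))   # (0, -1)
```

The sign flip in the last line is the point: the representation distinguishes "A relative to B" from "B relative to A" natively, which is why spatial relationships come cheaply per parameter.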

What makes this California-specific is the training data. Every photograph taken at a donation center, every pallet counted at a school loading dock, every pothole captured by a public works app generates labeled training data. Over time, a vision system trained on California warehouse images, California road conditions, and California infrastructure assets becomes measurably more accurate for California use cases. The people using it are training it, with real data in real conditions.

This is sovereign AI in the most practical sense: a model trained on local data, running on local devices, improving through local use, owned by the public agencies that generated the training data.

Loot Locker: A Pilot That Teaches Everything

Loot Locker is an AI-powered donation intake system built on SAP Business Technology Platform. It originated at a hackathon. A person walks up to a donation event, photographs what they are donating, describes it, and the system handles classification, condition scoring, and routing: resell, refurbish, recycle, or dispose.

When the vision model lacks confidence, it asks for help. "What size shoes?" "Remove the clothes from the bag and photograph them separately." "Rescan that QR code." The system knows what it needs.
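The "ask for help" behavior is a confidence gate. The sketch below shows the pattern under stated assumptions: a classification is accepted only above a threshold, and below it the system returns the clarifying prompt instead of a routing decision. The labels, scores, threshold, and destinations are invented stand-ins for the vision model's actual outputs.

```python
# Hedged sketch of the confidence-gated classification pattern: accept a
# label only above a threshold, otherwise return a follow-up prompt for
# the donor. Labels, scores, and destinations are invented stand-ins.

FOLLOW_UPS = {
    "shoes": "What size shoes?",
    "bagged-clothes": "Remove the clothes from the bag and photograph "
                      "them separately.",
    "unreadable-code": "Rescan that QR code.",
}

def triage(label: str, confidence: float, threshold: float = 0.85):
    """Return ('route', destination) or ('ask', follow-up question)."""
    if confidence >= threshold:
        destination = "resell" if label == "shoes" else "refurbish"
        return ("route", destination)
    return ("ask", FOLLOW_UPS.get(label, "Please take another photo."))

print(triage("shoes", 0.93))            # confident: route to resale
print(triage("bagged-clothes", 0.41))   # uncertain: ask for better input
```

Every "ask" interaction also produces exactly the labeled example the model was missing, which is how the pilot generates its own training data.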

As a pilot, Loot Locker teaches every concept in this paper at low cost and high visibility:

Concept | How Loot Locker Teaches It
Business Network | Routes donations to resale partners, recyclers, and distribution points through network collaboration
Edge AI / CliffordNet | On-device vision classification in variable lighting and conditions
SAP Build | Mobile app built on low-code platform with camera integration
Digital Manufacturing | Classification, quality grading, and routing as a production workflow
xP&A | Donation volume forecasting, capacity planning, distribution optimization
Training Data | Every interaction produces labeled data for model improvement
Workforce Development | Participants learn AI-assisted classification, logistics coordination, data quality

The classification and routing logic scales. At a community drive, you sort shoes. At a defense receiving dock, you determine whether an inbound shipment routes to a COMSEC cage, a flight line, or a depot in theater. Same pattern. Different consequences.

Chapter 5: The Digital Twin Progression

A legislator asks: if we fund this interoperability work, what do we get? The answer is a working model of how California's public infrastructure actually operates. Not a dashboard. Not a report. A model you can ask questions of and test decisions against before committing resources. That model is a digital twin.

From Data to Digital Twin

An organization that has connected to the Business Network, equipped field workers with AI-enabled mobile apps, deployed edge sensors, and integrated all of it into a unified planning environment has built the foundation of a digital twin without necessarily naming it.

SAP Business Transformation Management (BTM) provides the framework for a digital twin of the enterprise: a functional model of how the organization operates, covering its processes, data flows, resource allocations, and performance against plan. This is not a static visualization. It can be queried, analyzed, and used for scenario planning.

For a city government, a digital twin means answering questions that are currently unanswerable. What happens to emergency logistics capacity if a primary warehouse floods? How does a 15% increase in school enrollment affect the food distribution network? If three maintenance crews are redirected to wildfire recovery, what deferred maintenance risk accumulates on water infrastructure?
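In miniature, a what-if question like the flooded-warehouse scenario reduces to running a disruption against the model and reading off the shortfall. The sketch below uses hypothetical capacity figures; a real digital twin would pull these from live operational data:

```python
# Hypothetical capacity figures (pallets/day); a digital twin would source these live.
WAREHOUSES = {"north": 1200, "central": 900, "coastal": 600}
BASELINE_DEMAND = 2000

def scenario(offline: set[str]) -> dict:
    """Test a disruption before it happens: what capacity survives, what is short?"""
    remaining = sum(cap for name, cap in WAREHOUSES.items() if name not in offline)
    return {"capacity": remaining, "shortfall": max(0, BASELINE_DEMAND - remaining)}

print(scenario(set()))        # {'capacity': 2700, 'shortfall': 0}
print(scenario({"central"}))  # {'capacity': 1800, 'shortfall': 200}
```

The value of the twin is not this arithmetic; it is that the inputs are the organization's actual, current data rather than a spreadsheet snapshot.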

Integrated Product Development and Infrastructure Management

The digital twin extends into Integrated Product Development (IPD), a discipline from aerospace and manufacturing where complex physical assets are managed through their entire lifecycle: design, production, operation, maintenance, retirement.

Public infrastructure is a complex physical asset. A bridge has a design specification, a construction history, an operational profile, a maintenance schedule, and an end-of-life plan. So do a water treatment plant, a school building, and a power substation. Managing these assets the way aerospace manages aircraft, with full lifecycle digital twins, predictive maintenance, and integrated planning, is not aspirational. The technology exists. The gap is in applying it.

The NVIDIA Omniverse Partnership

When the digital twin needs spatial visualization, the model extends into simulation. A city planner needs to see traffic flow when a new transit line opens. An emergency manager needs to test evacuation routes during a wildfire.

SAP's partnership with NVIDIA Omniverse connects enterprise operational data to physically accurate 3D simulation environments. Operational data (asset conditions, logistics flows, workforce positions) comes from SAP. Spatial rendering comes from Omniverse. The result is a simulation environment where policy decisions can be tested before implementation, with real data.

This is the final step: Connect, Equip, Sense, Plan, Twin, Simulate. Each step builds on the previous one. Each is a pilot in its own right. The entire progression runs on infrastructure that California agencies are already paying for or can access through the Business Network.

Chapter 6: Digital Identity and Citizen Services

The Same Pattern, Applied to People

The interoperability challenge appears again in citizen services. A California resident displaced by wildfires needs DMV replacement, emergency benefits, utility transfers, and housing assistance. Each requires separate identity verification through separate systems. For the one in five Californians who lack reliable internet, these barriers compound.

The California Identity Gateway, part of the Envision 2026 strategy, addresses identity fragmentation. This aligns with the European IPCEI-CIS (Important Project of Common European Interest, Cloud Infrastructure and Services) initiative, which is building federated digital identity infrastructure across member states. The architectural pattern matches the utility data space: federated, sovereign, interoperable. Each agency keeps its identity systems while participating in a shared verification layer.

But identity is not just a citizen services problem. It is the missing piece of the data space architecture described in Chapter 1. Before two organizations can share data in a federated model, they must verify each other's identity without relying on a central authority. Identity is the gate that makes data sovereignty enforceable.

Why the U.S. Stalled on Decentralized Identity

The technology for decentralized identity verification exists. Europe is running it in production. The U.S. is not, and the reason matters for California's planning.

In 2022, a DARPA-commissioned study by Trail of Bits found that blockchain implementations contained structural weaknesses: unintended centralities, vulnerable node software, traffic concentration through a small number of ISPs. The cryptography was sound. The infrastructure was not. Since blockchain had been the leading candidate for decentralized identity, the report froze U.S. government momentum.

Europe responded differently. Rather than abandoning decentralized identity, European initiatives separated the concept of self-sovereign identity (SSI) from blockchain. SSI requires three things: decentralized identifiers (DIDs), verifiable credentials issued against those identifiers, and a way to verify credentials without calling home to a central authority. The W3C standards for DIDs and Verifiable Credentials provide all three. The DID:Web method uses standard web protocols. No chain. No mining. No unintended centralities. This is the identity model now running in production at Catena-X.
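The DID:Web method's simplicity is worth seeing concretely: resolution is just a deterministic mapping from the identifier to an HTTPS URL where the DID document lives. A minimal resolver sketch, following the W3C did:web method specification (simplified: port encoding and other edge cases omitted):

```python
def did_web_to_url(did: str) -> str:
    """Map a did:web identifier to the HTTPS URL of its DID document,
    per the W3C did:web method spec (simplified: no port encoding)."""
    if not did.startswith("did:web:"):
        raise ValueError("not a did:web identifier")
    parts = did[len("did:web:"):].split(":")
    domain, path = parts[0], parts[1:]
    if path:
        # Additional colon-separated segments become URL path components.
        return f"https://{domain}/{'/'.join(path)}/did.json"
    # Bare domain: the document lives at the well-known location.
    return f"https://{domain}/.well-known/did.json"

print(did_web_to_url("did:web:example.com"))
# https://example.com/.well-known/did.json
print(did_web_to_url("did:web:example.com:agencies:water-resources"))
# https://example.com/agencies/water-resources/did.json
```

No ledger lookup, no consensus protocol: the identifier resolves over plain HTTPS, which is exactly why the method sidesteps the infrastructure weaknesses DARPA's study identified.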

SAP Decentralized Identity Verification (DIV)

SAP Decentralized Identity Verification is a BTP service that implements self-sovereign identity for inter-organizational communication. It enables organizations to sign, verify, and manage verifiable credentials through an administration interface and API. The service is already deployed in production dataspaces.

In the Catena-X dataspace, DIV handles the identification step of the Decentralized Claims Protocol. When a data consumer requests data from a provider, the consumer presents verifiable credentials issued by a trusted authority (e.g., the dataspace operator or a standards body like GLEIF). The provider's Data Space Integration service checks the credentials against its access policies. If the credentials are valid, the contract negotiation and data exchange proceed. No central identity broker is involved. The verification is cryptographic, auditable, and sovereign.
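Stripped to its essentials, the provider's policy check is: is the issuer on my trust list, is the credential the right type, and is it still valid? The sketch below is a heavy simplification of the Decentralized Claims Protocol — it omits the cryptographic proof verification that a real implementation performs — and all names are illustrative:

```python
from datetime import date

# Illustrative trust anchors; a real dataspace publishes these in its governance framework.
TRUSTED_ISSUERS = {"did:web:operator.example", "did:web:gleif.example"}

def verify_presentation(credential: dict, required_type: str) -> bool:
    """Simplified policy check: trusted issuer, right credential type, not expired.
    Real implementations also verify the cryptographic proof on the credential."""
    return (
        credential["issuer"] in TRUSTED_ISSUERS
        and required_type in credential["type"]
        and date.fromisoformat(credential["expires"]) >= date.today()
    )

membership = {
    "issuer": "did:web:operator.example",
    "type": ["VerifiableCredential", "DataspaceMembership"],
    "subject": "did:web:small-utility.example",
    "expires": "2099-12-31",
}
# Only if the policy check passes does contract negotiation proceed.
print(verify_presentation(membership, "DataspaceMembership"))  # True
```

Note what is absent: no call to a central identity broker. The verifier needs only the trust list and the credential itself.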

The same architecture applies directly to the utility data space described in Chapter 1. When a small utility shares asset data with a large IOU during an emergency, DIV verifies both parties' identities through credentials issued by a trusted authority (the CPUC, a utility coordination entity, or the dataspace operator itself). The small utility does not hand over control of its identity to the large utility. It presents a credential. The credential is verified. The data flows. When the coordination scenario ends, the data sharing stops. The identities remain sovereign.

SAP's Data Space Integration capability (part of SAP Integration Suite) implements the general Dataspace Protocol for data exchange. DIV handles the identity layer. Together, they provide the complete technical foundation for federated data spaces on BTP. When the data exchanged must conform to NIEM standards (as will increasingly be required for emergency management and cross-jurisdictional coordination), the exchange packages can be structured as NIEM-conformant IEPDs, with DIV verifying participant identity and Data Space Integration handling the transport. The result: verified organizations exchanging standardized data, with no central broker and no vendor lock-in on either side.

The MOSA Principle Applied to Identity

For technologists familiar with defense acquisition, the architecture maps to a known principle.

The Modular Open Systems Approach (MOSA) mandates that systems be composable, interoperable, and vendor-independent. MOSA was designed for hardware: modular weapons systems, open avionics standards, interchangeable sensor payloads. The principle is that no single vendor should own the interface between components.

Applying MOSA to identity is a conceptual shift. Traditional identity and access management is monolithic: one IAM system controls authentication and authorization for an entire organization. Federated identity (SAML, OIDC) distributes authentication across providers but still relies on bilateral trust relationships that become brittle at scale. Self-sovereign identity, implemented through verifiable credentials and decentralized identifiers, makes identity itself a composable, interoperable component. An organization's identity is not locked into any single provider's IAM system. It is a credential that can be verified by any party that trusts the issuer.

This is a familiar idea in Europe, where SSI is already operating in production dataspaces. In the U.S. defense and government context, where the DARPA-commissioned Trail of Bits report created justified skepticism about blockchain-based approaches, it represents a path forward that addresses DARPA's concerns while delivering the decentralization that data sovereignty requires. The identity is decentralized. The verification is cryptographic. The implementation runs on standard web protocols, not blockchain infrastructure.

For California, this means the Identity Gateway does not need to be a centralized system. It can be a trust framework: a set of credential schemas, trusted issuers, and verification policies that allow agencies to verify each other and verify residents through credentials rather than through point-to-point integrations. The same DIV infrastructure that verifies utility identities in a data space can verify agency identities in citizen services. The same verifiable credential that proves a resident's eligibility for emergency benefits can prove their identity at the DMV, without either agency sharing backend systems.

Zero-Knowledge Proofs: Prove the Claim, Protect the Data

Verifiable credentials solve the trust problem. Zero-knowledge proofs solve the privacy problem that comes with it.

A zero-knowledge proof (ZKP) allows one party to prove a statement is true without revealing the data behind it. In the context of SSI, this means selective disclosure: a credential holder proves they meet a requirement without exposing the full credential. A resident proves they are over 18 without revealing their birthdate. An agency proves it holds a valid operating license without exposing its financial statements. A utility proves it meets CPUC safety standards without sharing proprietary grid data with competitors.
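Selective disclosure can be illustrated without full ZKP machinery. The sketch below is not a BBS+ implementation; it uses a salted-hash scheme in the style of SD-JWT selective disclosure, where the issuer signs only commitments to claims and the holder reveals one claim at a time. All names and claims are illustrative:

```python
import hashlib
import json
import secrets

def commit(claim_name: str, value: str, salt: str) -> str:
    """Salted hash commitment to a single claim (SD-JWT-style, simplified)."""
    return hashlib.sha256(json.dumps([salt, claim_name, value]).encode()).hexdigest()

# Issuer: signs only the digests, so the credential reveals nothing by itself.
salts = {"name": secrets.token_hex(16), "over_18": secrets.token_hex(16)}
claims = {"name": "Alice Resident", "over_18": "true"}
signed_digests = {k: commit(k, v, salts[k]) for k, v in claims.items()}

# Holder: discloses ONLY the over_18 claim (value + salt), never the name.
disclosure = ("over_18", claims["over_18"], salts["over_18"])

# Verifier: recomputes the commitment and matches it against the signed digest.
name_, value_, salt_ = disclosure
assert commit(name_, value_, salt_) == signed_digests["over_18"]
print(f"verified {name_}={value_} without learning any other claim")
```

True ZKP schemes like BBS+ go further — proving predicates such as "birthdate implies over 18" without revealing even the claim value — but the privacy principle is the same: the verifier learns only what the transaction requires.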

This is not theoretical. W3C Verifiable Credentials support ZKP-based selective disclosure through BBS+ signatures and related cryptographic schemes. The European Union Digital Identity Wallet (EUDIW) program is building ZKP capabilities into its SSI architecture to comply with GDPR data minimization requirements. The principle: collect and share only the minimum data necessary for the transaction.

For the compliance pressures described in Chapter 3, ZKPs provide a critical bridge. A state agency can prove GASB 104 compliance to a legislative auditor through a verifiable credential without exposing the full asset inventory to every party in the verification chain. A small utility in the data space can prove it holds valid credentials to participate in emergency coordination without revealing its financial position to a large IOU. Identity verification and compliance attestation happen simultaneously, with privacy preserved by design.

SAP's DIV service, built on W3C standards, provides the infrastructure for issuing, holding, and verifying these credentials. The ZKP layer is where identity, compliance, and privacy converge into a single architectural pattern. The identity is sovereign. The proof is selective. The verification is cryptographic. The data stays where it belongs.

Multi-Channel Service Delivery

Multi-channel service delivery, where a single backend supports web, phone, SMS, kiosks, and offline-capable mobile, is a proven pattern with measurable outcomes in comparable government deployments worldwide. The design principle: digital inclusion is an architectural requirement, not an add-on. Services must work for a resident on gigabit fiber and for a resident on intermittent cellular in a rural area that has just been evacuated.

The equity dimension connects to the infrastructure story. The same rural communities served by under-resourced utilities are the most digitally excluded and the most vulnerable to wildfire.

Chapter 7: Responsible AI and Governance

California introduced more than 33 AI bills in the first half of the 2025-2026 session. SB 53 regulates frontier model developers. AB 316 bars AI autonomy as a defense in civil actions. AB 853 expands content provenance requirements. Every capability in this paper is designed to operate within this framework and the responsible AI principles established by the Governor's Frontier AI Working Group.

Human Oversight and Accountability

All AI recommendations require human review before action. In utility coordination, AI surfaces priorities; human operators decide deployment. In logistics, AI optimizes routes; human dispatchers confirm. In citizen services, AI handles routine inquiries; complex cases escalate to trained staff. Every AI decision is logged with an auditable trail.
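The review-before-action pattern above can be sketched as an approval gate that writes an audit record on every decision. This is an illustrative sketch, not SAP's implementation; in production the log would be an append-only store and the model call would be a real service:

```python
import json
from datetime import datetime, timezone

AUDIT_LOG: list[dict] = []  # in production, an append-only audit store

def ai_recommend(case: str) -> str:
    """Stand-in for a model call; it always recommends, never acts."""
    return f"deploy crew to {case}"

def execute_with_oversight(case: str, operator_approves) -> str:
    recommendation = ai_recommend(case)
    approved = operator_approves(recommendation)  # human decision point
    AUDIT_LOG.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "case": case,
        "recommendation": recommendation,
        "approved": approved,
    })
    return recommendation if approved else "escalated to supervisor"

result = execute_with_oversight("sector 7 outage", lambda rec: True)
print(result)                               # deploy crew to sector 7 outage
print(json.dumps(AUDIT_LOG[-1], indent=2))  # every decision is recorded
```

The structural point is that the audit write sits between recommendation and action, so no path exists where the AI acts without a logged human decision.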

Transparency and Explainability

Consistent with SB 53 and AB 853, all proposed systems are designed for transparency. Model capabilities and limitations are documented. Users can understand why the AI made a recommendation. AI-generated content is labeled. Risk assessments include confidence indicators.

Digital Inclusion as Design Principle

Multi-channel access reaches residents regardless of connectivity. Multi-lingual support reflects California's diversity. Shared service models provide capability to under-resourced organizations. Offline-capable solutions serve communities where broadband has not arrived or has been destroyed.

Data Sovereignty and Privacy

The Tractus-X, DS4SSCC, and FIWARE data space models show that federated data sharing does not require centralized data collection. Each organization retains sovereignty over its data. Jurisdictions retain control of their systems while participating in shared standards. This is a design principle, not a concession.

Chapter 8: Pilots and Next Steps

The mandates described in this paper (GASB 104, NIEM conformance, SAMOSA, Envision 2026) are on calendars, not wish lists. The progression (connect, equip, sense, plan, twin, simulate) is not a monolithic program. It is a series of pilots. Each delivers value independently. Each meets a specific compliance or coordination requirement. Together, they build toward a digital twin of California's public infrastructure.

Pilot What It Proves Entry Point Timeline
Business Network Logistics SLED customers can coordinate freight, supplies, and warehouse capacity through the network without procuring TM/EWM One city or county with Ariba 6 months
Loot Locker AI classification, edge vision, training data generation, and workforce development in a single deployment One community partner or school district 3–6 months
xP&A Planning Integrated planning across finance, procurement, logistics, and assets gives public agencies predictive capability One state agency on S/4HANA 6–12 months
Edge Vision for Field Service On-device CliffordNet models trained on California imagery deliver value for pothole detection, asset inspection, or shelter routing One public works or outreach department 6 months
Utility Data Space Small utilities share critical asset data with large IOUs and emergency management through a NIEM-conformant, federated layer modeled on Tractus-X 3–5 small utilities in fire-prone regions 6–12 months
School Kitchen Digital Manufacturing Same-day food logistics orchestrated through Digital Manufacturing with camera-based quality verification One school district central kitchen 6 months
Cost Transparency (PaPM + CAP) Fully burdened cost-per-service model with auditable allocation logic, meeting GASB 104 capital asset disclosure requirements and surfaced through an open-source CAP application One state agency modeling the true cost of a specific service 6 months

Workforce Development

Every pilot creates a training pathway. Loot Locker trains participants on AI-assisted classification. The warehouse vision system trains staff on edge deployment. The Business Network pilot trains procurement teams on logistics collaboration. The xP&A pilot trains financial analysts on integrated planning. The cost transparency pilot trains budget staff on allocation modeling, auditable cost attribution, and GASB compliance reporting. For state and local government, workforce development matters as much as operational outcomes. It should be measured alongside them.

Proposed Immediate Actions

1. Convene a working session with utility coordination entities, CPUC, and emergency management stakeholders to scope the utility data space pilot, using the Tractus-X governance model and NIEM exchange standards as the foundation.

2. Identify one city or county on Ariba to pilot Business Network logistics coordination for emergency supply staging or routine distribution.

3. Deploy Loot Locker as a pilot that teaches edge AI, training data generation, and workforce development simultaneously.

4. Engage a state agency on S/4HANA for an xP&A proof of concept across finance, procurement, and assets.

5. Model the true cost of one service using PaPM allocation logic and a CAP application dashboard, producing GASB 104-compliant asset disclosures alongside fully burdened cost transparency from shared infrastructure through program delivery.

"Enterprise Architects have a unique opportunity to interact with customers not transactionally, but transformationally."

Jason Shearer

The infrastructure is closer than anyone in the room thinks. California has the opportunity to lead.

Appendix A: Global Evidence Base

The following deployments provide evidence that the capabilities described in this paper are proven in production. Each entry links to its published Innovation Award submission.

California and U.S. Government

Customer Key Outcome Case Study
CA Dept of Water Resources Billing automation: days to hours View →
Southern California Edison 48.6% process efficiency; 19.2% IT cost reduction View →
SDG&E 2.7M users; 5-star mobile platform View →
DTE Energy (Michigan) 397,000+ shutoffs prevented; $88M+ in assistance View →
National Grid US 30% training engagement increase via GenAI View →
Defense Logistics Agency 25% productivity increase; $46B supply chain View →
U.S. Dept of the Interior 87 legacy systems decommissioned; 26M transactions View →
U.S. Dept of Justice 30% IT cost reduction; 9.5/10 user satisfaction View →
Lockheed Martin (1LMX) Unified AI across 50+ apps; 120K employees View →
Commonwealth of PA 10x reporting speed improvement View →

Innovative City and State Deployments (Global)

Customer Key Outcome Case Study
City of Hagen (Germany) 90% budget evaluation time reduction; €1B budget View →
City of Vancouver Budget cycle: 7 weeks to 3 weeks View →
City of Antibes (France) 30% workload reduction; AI-powered procurement View →
German Federal Foreign Office 50% manual reduction; 83.2% same-day resolution View →
Spain Public Employment Service 125,000 monthly benefits; 20% productivity gain View →
Canton Zurich Budget sources: 120 to 1; full compliance View →
Transport for NSW A$211B asset portfolio on unified platform View →

Utility and Energy

Customer Key Outcome Case Study
Thames Water (UK) 85% efficiency gain; £2.2M combined savings View →
E.ON (Germany) GenAI-powered finance; 40% efficiency increase View →
Essent (Netherlands) Monthly processing: 36+ hours to <10 hours View →
GS Inima (Spain) Digital twin: 20% cost reduction; 30% resource gain View →
Tata Power (India) 50% financial closure improvement; 30% IT cost cut View →

Enterprise AI and Supply Chain

Customer Key Outcome Case Study
Bosch Power Tools 1.5M tickets/year; Joule AI: 35% accuracy gain View →
AutoScheduler.AI Warehouse planning: 8 hours to 20 minutes View →
SBB AG (Swiss Rail) AI predictive maintenance; lifecycle extension View →
Embraer 41,000 man-hours saved; 5% FTE productivity gain View →
Honeywell 94% on-time payment (from 74%); 96% first-pass yield View →

Appendix B: Enterprise AI Capabilities Reference

AI capabilities available today in enterprise platforms, mapped to roadmap items and Discovery Center pages. For existing California government and utility customers, many may be included in current license entitlements.

Legend: GA = Generally Available; Planned = on roadmap, not yet released.

Utility Operations & Field Service

Capability What It Does Reference Status
Intelligent Scheduling AI optimizes technician routes and job assignments by skills, location, priority Roadmap GA
Predictive Maintenance Predicts equipment failures using sensor and historical data Roadmap GA
Customer Self-Service Agent Natural language service requests routed to resolution; ~45% FTE cost reduction, ~90% handling cost reduction, 10x ROI Discovery Center GA
Smart Meter Integration High-volume meter data processing for billing and consumption analytics Roadmap GA
Consumption Forecasting Predicts utility demand based on weather, history, and events Roadmap GA

Emergency Response & Logistics

Capability What It Does Reference Status
Demand Sensing Predicts short-term demand changes from external signals Roadmap GA
Route Optimization Calculates optimal delivery routes with real-time constraints Roadmap GA
Intelligent Warehouse Slotting Optimizes storage locations based on movement patterns Roadmap GA
Supplier Risk Monitoring Flags supply chain disruptions before impact Roadmap GA
Supply Chain Control Tower Single real-time view across suppliers, logistics, inventory Roadmap GA

Document Intelligence & Compliance

Capability What It Does Reference Status
Document Information Extraction Extracts structured data from invoices and forms Discovery Center GA
Document Summarization Summarizes documents into key points and action items Discovery Center GA
Compliance Monitoring Monitors transactions for regulatory compliance flags Roadmap GA
Custom Document Processing Build extraction pipelines for specific document formats Roadmap GA

Citizen Services & AI Assistants

Capability What It Does Reference Status
Joule AI Copilot Natural language assistant embedded across enterprise applications Discovery Center GA
AI Case Classification Categorizes and routes inquiries to correct resolution Discovery Center Planned
Sentiment Analysis Detects citizen/customer sentiment in communications Discovery Center GA
Personalization Engine Tailors digital experiences based on user context and history Roadmap GA
Decentralized Identity Verification (DIV) Self-sovereign identity for inter-organizational verification; verifiable credentials, DID:Web, Decentralized Claims Protocol Product Page GA

Planning & Analytics

Capability What It Does Reference Status
Smart Insights Surfaces anomalies and trends humans would miss Roadmap GA
Predictive Planning Forecasts based on historical patterns with what-if scenarios Discovery Center GA
Workforce Analytics Predicts attrition risk and identifies skill gaps Roadmap GA
GenAI Hub Access to foundational models for custom AI development Roadmap GA

Responsible AI

Capability Responsible AI Dimension Reference Status
Audit Trail Logging Accountability: all AI actions recorded and reviewable Platform capability (SAP AI Core / SAP BTP) GA
Confidence Scores Transparency: users see AI certainty level Roadmap GA
Human-in-the-Loop Controls Oversight: critical actions require human approval Roadmap Planned
Data Masking Privacy: sensitive data protected in AI processing Roadmap GA
Decision Explainability Transparency: reasoning behind AI recommendations Discovery Center GA

Platform Scale

Metric Value
AI features on current roadmap 2,916
Products with AI capabilities 196
Available today (GA) ~60% of roadmap
Utility-specific AI features 150+
Public sector AI features 100+
Logistics/supply chain AI features 300+
Published Innovation Award use cases 375+
Business Network trading partners 15M+

DRAFT | CCST AI Applications and Innovation Showcase, March 4, 2026

This document supports evidence-based policy dialogue and does not constitute a commercial proposal.

Document version: 4.2
Last updated: February 27, 2026