Work we’ve done

Trading

An automated data pipeline with AI to augment trader judgment, detect risk, and improve probability and speed.

Digital Twins

Digitising government assets and infrastructure enabling better management, efficiency and security.

Procurement

Reduce risk, costs, delays, mistakes in the supply chain process with automation and AI.

Projects we’ve done

  • Trading

    Client

    Global commodities and equities trader.

    Challenges

    - The traders relied on personal and company-wide spreadsheets / balance sheets, informal chat channels, public chat forums, and public and private data sources.

    - The datasets are unstructured, sometimes messy and fragmented.

    - There are inconsistent data standards across regions, including languages, units, estimates versus actuals, unreliable risk-adjusted datasets, and no easy way to extract numbers or risk patterns from free text.

    - Some of the data requires heavy processing, e.g. weather, crop yields, NASA satellite data.

    - Decisions need to make allowances for structural breaks such as wars, sanctions, tariffs, extreme weather etc.

    - Multiple systems and databases, manual spreadsheets, silos.

    - People missing data and making biased, emotive and ill-informed trading decisions.

    - Protecting IP, confidential information, trade secrets and formulas within the client’s balance sheets.

    - Building a clean, real-time, unified data layer can be resource intensive.

    Solution

    Zai Node is delivering a hybrid solution: a human-led strategy with AI-augmented decision support and automated operational optimisation.

    The first major challenge, data fragmentation, is solved by our engineers and data scientists building a centralised data lake that automatically pipes in real-time and manual data (e.g. markets, shipping, weather, crop yields, stock reports, credit).

    A VLM and an LLM extract data from unstructured sources, including satellite imagery and public chat forums in many languages.

    We have standardised and cleaned the data and created a “single source of truth”. We think of this as a “digital twin” of a commodities trading balance sheet.

    An ML model helps detect structural breaks from extreme events and stress-tests the client’s models under historical crises, combining probability estimation with anomaly detection.
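The anomaly-detection idea can be illustrated with a minimal sketch: a rolling z-score that flags points breaking sharply from recent history. The window size, threshold and price series below are invented for illustration; the production models are substantially more involved.

```python
# Minimal anomaly / structural-break sketch using a rolling z-score.
# Window, threshold and data are illustrative assumptions only.
from statistics import mean, stdev

def rolling_zscore_anomalies(series, window=20, threshold=3.0):
    """Flag indices where a point deviates from the trailing window
    by more than `threshold` sample standard deviations."""
    anomalies = []
    for i in range(window, len(series)):
        hist = series[i - window:i]
        mu, sigma = mean(hist), stdev(hist)
        if sigma > 0 and abs(series[i] - mu) / sigma > threshold:
            anomalies.append(i)
    return anomalies

# A steady price drift with one shock (e.g. a sanctions announcement).
prices = [100 + 0.1 * i for i in range(30)] + [140] + [103 + 0.1 * i for i in range(10)]
print(rolling_zscore_anomalies(prices))  # → [30]
```

The same pattern generalises: swap the rolling z-score for whichever model fits the series, and feed flagged points into the stress-testing step.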

    Impact

    The data pipeline solution improves decision-making speed and reliability, and eliminates spreadsheet risk. We’re predicting a 1–3% improvement in trading margins from greater speed and fewer errors, and a 30% reduction in operational time and resources.

    Institutional knowledge is captured instead of lost. It becomes codified, measurable, repeatable and scalable, moving on from an individual balance-sheet approach.

    Upshot

    - Clean, structured, automated data pipe.

    - Use AI to augment trader judgment.

    - Automate operations and data processing.

    - Use AI to help collect and translate data, detect risk, improve probability.

  • Digital Twins

    Client

    State government agency responsible for transport, utilities and civic assets.

    Challenges

    - Fragmented data in silos and paper records.

    - No unified real-time visibility across agencies and asset classes, leading to reactive maintenance, breakages, high emergency costs and service disruption.

    - Disconnected capital planning, duplication of works, public scrutiny, competing funding priorities, and a need for transparency.

    - Infrastructure stress and utilisation were not modelled dynamically.

    - Security and sovereignty concerns with expanding attack points.

    Solution

    Our team’s involvement focused on building a centralised, standardised asset data layer integrating BIM, GIS, IoT sensor data, SCADA and various operational systems.

    We also provided advice on real-time monitoring and predictive analytics, model selection and tuning, and sensor monitoring with predictive maintenance modelling.
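The predictive-maintenance modelling can be sketched as fitting a degradation trend to sensor readings and projecting it forward to a failure threshold. The sensor values and threshold below are invented for illustration, not drawn from the agency’s systems.

```python
# Sketch: least-squares trend on a degradation signal, projected to a
# failure threshold. Readings and threshold are invented examples.

def remaining_useful_life(readings, threshold):
    """Fit y = slope*t + intercept; return estimated periods until the
    trend crosses `threshold`, or None if the signal isn't degrading."""
    n = len(readings)
    xs = range(n)
    x_mean = sum(xs) / n
    y_mean = sum(readings) / n
    slope = sum((x - x_mean) * (y - y_mean) for x, y in zip(xs, readings))
    slope /= sum((x - x_mean) ** 2 for x in xs)
    if slope <= 0:
        return None  # flat or improving: no predicted failure
    intercept = y_mean - slope * x_mean
    crossing = (threshold - intercept) / slope   # time index at threshold
    return max(0.0, crossing - (n - 1))          # periods past the last reading

# Vibration readings trending up toward a failure threshold of 10.0
vibration = [1.0, 1.5, 2.1, 2.4, 3.0, 3.6]
print(round(remaining_useful_life(vibration, 10.0), 1))  # → 12.7
```

An estimate like this lets maintenance be scheduled before the predicted crossing rather than after a breakage.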

    Impact

    - Single source of truth across infrastructure portfolio.

    - Reduced reactive maintenance and service disruptions.

    - Improved emergency response coordination.

    - Evidence-based capital allocation.

    - Enhanced transparency to government and the public.

    - Reduced cyber and operational risk.

  • Procurement supercharged with AI

    Client

    National quick-service restaurant chain.

    Challenges

    - Business improvement project revealed considerable costs, cycle times, delays, errors, rework and value loss in the procurement process.

    - There was often uncertainty about what the business rules were, who could make decisions about purchase categories, what the company had previously paid (and on what agreed terms) for like categories, and how the contracting process worked.

    - Creating RFIs and RFPs and assessing bids and commercials was manual and often didn’t draw on historical data.

    - An average of 132 total labour hours per contract until commencement of deliverables, and an 11-week cycle time.

    Solution

    - Current-state analysis to better understand the problem, root causes, goals, and the relevant data and systems.

    - Clean, structure and optimise existing datasets.

    - Use a public model to assist in drafting RFI/RFP templates, tailor questions by category, summarise vendor responses and highlight non-compliance.

    - For bid responses we used a VLM to extract pricing, delivery timelines, and compliance statements from vendor submissions automatically.

    - To assess the bids we used an LLM-based classification model trained on historical award data, then scored suppliers on cost, risk, quality, and other defined metrics.

    - An agentic AI tool analyses vendor documents for risk and compliance issues (e.g. sanctions, ESG policies, cyber posture, AML, chain of custody) using Hugging Face NER pipelines and GPT-based RAG over vendor data.

    - Internal approvals use Power Automate, agentic AI and MCP servers for process mining and task automation: identifying bottlenecks, predicting approval delays and routing workflows dynamically.

    - Predict spend by category, supplier or region, and detect anomalies or non-compliant spend, using classical statistical approaches combined with a fine-tuned LLM for post-analysis and Power BI for reporting.

    - Finally, we created a Procurement Assistant chatbot: RAG with text embeddings and a custom fine-tuned LLM, answering queries such as “Show top suppliers by A, B and C.”
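The retrieval step behind such a chatbot can be sketched end to end with toy bag-of-words vectors standing in for learned text embeddings. The documents and query are invented; the production system pairs the retrieved context with a fine-tuned LLM to draft the answer.

```python
# Toy RAG retrieval sketch: bag-of-words vectors plus cosine similarity
# stand in for learned embeddings. Documents and query are invented.
import math
from collections import Counter

def embed(text):
    """Stand-in embedding: a sparse term-count vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=1):
    """Return the k documents most similar to the query; an LLM would
    then answer the query grounded in these documents."""
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "supplier Acme contract packaging unit price agreed 2023",
    "fleet maintenance vendor schedule and warranty terms",
    "top suppliers by spend category packaging and logistics",
]
print(retrieve("show top suppliers by packaging spend", docs))
```

In production the count vectors would be replaced by embedding-model vectors and a vector store, but the rank-then-ground flow is the same.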

    Impact

    70–80% reduction in manual labour from “we need to buy” until contract delivery start.

    55% faster cycle times from RFI to contract signed.

    Full audit trail, risk visibility and decision consistency.

    Accurate single source procurement spend forecast across whole group.

    Labour average is 45 hours per contract (down from more than 130).

    All FAQs are handled online by Q&A chat, with only 12% escalating to a human.

Local, private AI

What happens now to data used in public AI?

  • 01 User submits chats, queries, docs into a web-based public AI model

    Once the data is submitted, it’s visible to the public AI model owner; it must be visible for processing. Data transmitted across the internet isn’t inherently visible to others; however, more third parties are involved, so there’s more chance of interception.

  • 02 The public AI infra machine

    Most public AI companies use data submitted by users to “improve their services” and, unless users explicitly opt out, for “training purposes”. The public AI infra stack normally includes their backend OS, LLM, model training, product improvement, and data storage. Note that most public AI providers store user data, which creates more data-breach risk.

  • 03 Third parties process much of users' data

    Most public AI companies send user data to many third-party processors. E.g., OpenAI uses Microsoft, Cloudflare, CoreWeave, Oracle, Google, Snowflake, Salesforce and many more, in numerous countries. OpenAI publishes a dynamic Sub-Processor List.

  • 04 The upshot

    User and company data, personal and confidential info, and IP are shared with many unknown third parties in different countries for commercial reasons that don’t benefit the user. This increases company costs and may lead to harm later from data misuse.

What happens to data with Zainode’s AI on the client’s infra (DC or on-prem)?

  • 01 User submits chats, queries, docs into Zainode's web app

    Once the data is submitted, it’s visible only within the client’s infra, to the client and whoever they delegate access to. The client is in complete control of how their users connect to the Zainode app, e.g. via VPNs, completely offline, or other strong mechanisms of private transmission.

  • 02 Client's private infra, provided by Zainode

    Zainode provides the intake web app, LLM, servers and data storage. Note that Zainode’s data storage app does store data, but only in the client’s infra; the client is in full control of access, retention and deletion.

  • 03 No third parties at all

    None. Zero. Zainode’s private AI solution uses no third party data processors.

  • 04 The upshot

    User and company data, personal and confidential info, and IP are stored by the client, in the client’s complete control. This considerably mitigates data misuse and reduces tech costs.

    Unlike most public AI, all AI generated output by Zainode is provided with evidence and reasoning.

What happens to data with Zainode’s AI on Zainode infra?

  • 01 User submits chats, queries, docs into Zainode's web app

    Once the data is submitted, it’s visible only to the client and whoever they delegate access to. Zainode can connect directly to a client’s network to minimise the parties involved in data transmission, and to the client’s authentication system so users are centrally managed. The client has a full log of all connections and use.

  • 02 Zainode's infra, logically & physically isolated per client

    Zainode provides the intake web app, LLM, servers and data storage. Note that Zainode’s data storage app does store data, but the client is in full control of access, retention and deletion. No tenancy is shared between clients. Full data logging is available.

  • 03 No third parties at all

    None. Zero. Zainode’s private AI solution uses no third party data processors.

  • 04 The upshot

    User and company data, personal and confidential info, and IP are stored by us in the client’s single tenancy, under the client’s complete control. This considerably mitigates data misuse and reduces tech costs.

    Unlike most public AI, all AI generated output by Zainode is provided with evidence and reasoning.


Your model, your data, your AI

Custom prompt application

Each client accesses a single-tenant application, with a fully customizable interface showing your brand. Your staff can use your AI model with your company data.

  • Know everything in and out of AI, including AI’s reasoning, enabling better explanation of AI’s decisions and advice, and liability and risk management. Prompts and outputs stay entirely within your environment — we don’t access the data you process.

Purpose-built infra, not cloud

Zai Node’s infrastructure handles the computing demands of running modern-day AI inference workloads, helping bring your innovation to market privately and faster. Clients access the latest GPUs for ultra-high performance, low cost and ease of use. Your data is transiently processed by your GPUs.

Trust & compliance

Safely and securely unlock the benefits of AI. Organizations that bring AI to their data privately unlock powerful insights and competitive advantages, all while maintaining trust and compliance.

  • Zai Node avoids the expense, cost uncertainty, wastage and lack of visibility with the public cloud.

Deployment comparison
| Feature | On-prem | Our datacenter | Your datacenter |
| --- | --- | --- | --- |
| Data control | Full data control | Managed data security; we never store model inputs or outputs | Full data control in your DC; managed data security |
| Data residency requirements | Contained in country | Multi-region support with global deployment options | Region-locked data and deployments with multi-region support |
| Compute capacity | Customized to order | Customized to order | Leverage existing resources or Zai Node compute for overflow |
| Cost efficiency | High | Cost-effective, on-demand compute | Use in-house compute whenever available for optimized costs |
| Cost certainty | Fixed | Fixed and flexible | Fixed and flexible |
| Integration with internal systems | Custom or out-of-the-box integrations | Easy integration via ecosystem | Custom or out-of-the-box integrations |
| Performance optimization | On-chip model performance | On-chip model performance and low network latency | On-chip model performance and low network latency |
| Scalability | High, tailored scalability | High, flexible scaling options | High, flexible scaling options |
| Security and compliance | Organizational policies | SOC 2 Type II, HIPAA, and GDPR compliant | Organizational policies |
| Support and maintenance | Comprehensive support and managed services | Comprehensive support and managed services | Comprehensive support and managed services |
| Utilization of existing cloud commits | Spend down existing | Spend down existing | Use credits or commits |