Aurum — Unify
Connect Everything.
Automate What Happens Next.
Unify is a self-hosted data integration and orchestration platform. Build pipelines that connect your databases, APIs, files, email, and message bus — with real business logic at every step.
Integration Without the Complexity Tax
Most organizations move data between systems that were never meant to talk to each other — ERPs, databases, external APIs, file feeds, email workflows, message queues. Getting them to work together usually means custom code, fragile scripts, or expensive middleware platforms built for organizations ten times your size.
Unify is the alternative. A visual pipeline builder that connects your systems, enforces your business rules, and automates what happens when data moves between them. Self-hosted on your own infrastructure — no cloud dependency, no per-transaction fees, no vendor in the middle of your data.
How a Unify Pipeline Works
Every pipeline is built from three layers: inputs that trigger and feed the pipeline, data modules that do the work, and outputs that deliver results. Pipelines can have multiple inputs and multiple outputs — the architecture matches the complexity of real-world integrations.
A pipeline can be triggered and fed by any combination of:
- Intercept results — an Intercepts query detects a data exception and fires a Unify pipeline automatically as the response
- Inbound email — email received by the Unify server triggers and feeds data into the pipeline
- API calls — an external system or application calls Unify’s API to initiate a pipeline and pass in data
- Message bus events — events from Microsoft Message Queuing (MSMQ), IBM MQ, or Azure Service Bus trigger pipeline execution
- Scheduled execution — pipelines can run on a defined schedule without an external trigger
Multiple input types can feed a single pipeline simultaneously — aggregate data from multiple sources before processing begins.
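To make the multi-input idea concrete, here is a minimal sketch of what merging several triggering events into one working batch might look like. The event shapes and field names (`source`, `records`) are illustrative assumptions, not Unify's actual API.

```python
# Sketch: merge events from several hypothetical input sources into one
# working dataset before the module chain runs. Field names are illustrative.

def aggregate_inputs(events):
    """Combine records from all triggering inputs into a single batch."""
    batch = []
    for event in events:
        batch.extend(event.get("records", []))
    return {"sources": [e["source"] for e in events], "records": batch}

# One pipeline run fed by an API call and an inbound email simultaneously:
api_event = {"source": "api", "records": [{"sku": "A-100", "qty": 5}]}
email_event = {"source": "email", "records": [{"sku": "B-200", "qty": 2}]}

run_input = aggregate_inputs([api_event, email_event])
print(run_input["sources"])        # ['api', 'email']
print(len(run_input["records"]))   # 2
```

The point is the shape of the flow: all inputs resolve to one dataset before any module runs.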
Between inputs and outputs, data moves through a chain of configurable modules. Each module has deep configuration options — this isn’t a simple pass-through. Modules can:
- Transform datasets — reshape, reformat, map, and normalize data between the structures different systems expect
- Enforce business rules — apply your organization’s logic to data in motion, not just at rest
- Validate data — check data against business logic before it moves downstream; stop or redirect the pipeline if validation fails
- Report exceptions — surface data anomalies detected mid-pipeline for review or automated response
- Log execution — record what happened at each step for auditing and debugging
- Capture data at specific points — data logging modules can be placed anywhere in the pipeline flow to capture exactly what’s in motion at that point. Useful for compliance, auditing, and diagnosing complex integrations.
- Trigger an Intercepts query — run an Intercept inline as part of a pipeline, feeding its results forward into the next module
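A module chain of this kind can be pictured as a sequence of steps, each receiving the previous step's output, with validation able to halt the run. This is a minimal Python sketch under assumed names (`transform`, `validate`, `log_step`); Unify's actual modules are configured in its GUI, not written this way.

```python
# Sketch: a module chain as a list of callables applied in order.
# A validation failure raises and halts the run before data moves downstream.

class ValidationError(Exception):
    pass

def transform(record):
    # Reshape fields between the structures different systems expect.
    return {"item": record["sku"].upper(), "quantity": record["qty"]}

def validate(record):
    # Enforce a business rule on data in motion.
    if record["quantity"] < 0:
        raise ValidationError(f"negative quantity for {record['item']}")
    return record

def log_step(record):
    # Capture exactly what is in motion at this point in the flow.
    print(f"audit: {record}")
    return record

MODULES = [transform, validate, log_step]

def run_pipeline(record):
    for module in MODULES:
        record = module(record)
    return record

result = run_pipeline({"sku": "a-100", "qty": 5})
```

Placing the logging step at a different position in `MODULES` captures the data at a different point — the same idea as Unify's placeable data logging modules.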
Pipeline outputs can deliver results to any combination of:
- API calls — POST results to external systems, trigger webhooks, or return a response to the system that initiated the pipeline
- SQL operations — read from or write to SQL Server databases, including INSERT, UPDATE, and stored procedure execution
- Email — send formatted results, alerts, or reports to individuals or distribution lists
- Message bus — publish events to Microsoft Message Queuing (MSMQ), IBM MQ, or Azure Service Bus for downstream consumers
- Web portal — surface results directly in a web interface, enabling real-time data delivery to end users
A single pipeline can write to multiple outputs simultaneously — update a database, notify a team by email, and publish a message bus event all in one execution.
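The fan-out to multiple outputs can be sketched as one result handed to several delivery handlers in a single run. The handlers below just record deliveries; the names and the `deliveries` list are illustrative, standing in for real SQL, email, and message bus outputs.

```python
# Sketch: deliver one pipeline result to several outputs in a single run.

deliveries = []

def to_database(result):
    deliveries.append(("sql", result["item"]))      # e.g. an UPDATE statement

def to_email(result):
    deliveries.append(("email", result["item"]))    # e.g. a team notification

def to_message_bus(result):
    deliveries.append(("bus", result["item"]))      # e.g. a published event

OUTPUTS = [to_database, to_email, to_message_bus]

def deliver(result):
    for output in OUTPUTS:
        output(result)

deliver({"item": "A-100", "quantity": 5})
print(deliveries)  # [('sql', 'A-100'), ('email', 'A-100'), ('bus', 'A-100')]
```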
Pipelines are built in a GUI that lets you see the full flow — inputs, modules, and outputs — at a glance. Each node in the pipeline is configurable through a deep set of options appropriate to what that node does.
The visual builder means your IT team can understand, review, and manage pipelines without reading code. The depth of configuration per node means there’s no ceiling on what a pipeline can do.
- See the entire pipeline flow in a single view
- Add, remove, and reorder nodes without rewriting anything
- Configure each node independently with purpose-built options
- Use Auric (Aurum’s built-in scripting language) for custom logic within any node
- Comprehensive execution logging — full history of what ran, when, and what it processed
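The execution history above implies one structured record per step. As a rough illustration only — the real log format is Unify's own — a per-step record might carry fields like these:

```python
# Sketch: one structured log entry per module execution. Field names
# are illustrative assumptions, not Unify's actual log schema.
import json
from datetime import datetime, timezone

def log_record(pipeline, step, status, rows):
    return {
        "pipeline": pipeline,
        "step": step,
        "status": status,
        "rows_processed": rows,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    }

entry = log_record("stock-inquiry", "validate", "ok", 1)
print(json.dumps(entry, indent=2))
```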
Example: Real-Time B2B Stock Inquiry
A manufacturer sells through a North American distributor. The distributor has a defined API specification they require all suppliers to implement — XML-formatted HTTP requests and responses in a specific schema. The manufacturer’s inventory lives in their ERP system.
The Unify pipeline:
- Input: The distributor sends an XML-formatted HTTP POST to a URL provided by Unify — a real-time stock inquiry in the distributor’s required format
- Modules:
- Parse the incoming XML request and extract the item identifier
- Query the manufacturer’s ERP API in real time for current inventory levels and location
- Extract and transform the ERP response into the required fields
- Log the stock query — every request recorded with timestamp, item, and result
- Evaluate request frequency — if a particular item has been queried above a defined threshold, trigger an alert
- Outputs:
- Return current inventory level and location to the distributor, formatted to their exact API specification
- Send email notifications to process managers when an item’s request frequency exceeds the defined threshold
One pipeline handles the full B2B integration: inbound API translation, ERP query, audit logging, conditional alerting, and outbound response — all in real time, with no custom code to maintain.
When the distributor updates their API specification, update the relevant input and output nodes. The ERP query, logging, and alerting logic remain unchanged.
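The data path through the modules in this example can be sketched in a few lines: parse the inquiry XML, query stock, and build a response in the distributor's schema. The element names, the schema, and the `lookup_erp` stand-in are all illustrative assumptions — the real schema belongs to the distributor and the real query goes to the ERP's API.

```python
# Sketch: parse an XML stock inquiry, look up stock, build an XML response.
import xml.etree.ElementTree as ET

INQUIRY = "<StockInquiry><ItemId>WIDGET-42</ItemId></StockInquiry>"

def lookup_erp(item_id):
    # Stand-in for a real-time query against the manufacturer's ERP API.
    return {"qty": 130, "location": "Plant 2"}

def handle_inquiry(xml_text):
    item_id = ET.fromstring(xml_text).findtext("ItemId")
    stock = lookup_erp(item_id)
    response = ET.Element("StockResponse")
    ET.SubElement(response, "ItemId").text = item_id
    ET.SubElement(response, "Quantity").text = str(stock["qty"])
    ET.SubElement(response, "Location").text = stock["location"]
    return ET.tostring(response, encoding="unicode")

print(handle_inquiry(INQUIRY))
```

When the distributor changes their schema, only the parse and response-building steps change — the lookup in the middle is untouched, which is the same isolation the pipeline's input and output nodes provide.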
Example: Multi-ERP Inventory Lookup
An enterprise with branch locations across multiple regions needs end users to query live inventory — but each branch runs a different ERP system.
The Unify pipeline:
- Input: A web form submits a stock inquiry via API call to Unify
- Modules: Unify fans out simultaneous API calls to each branch ERP system, collects and aggregates results, normalizes the data across different ERP response formats
- Output: Formatted results returned to the web portal and displayed to the end user in real time
One pipeline. No custom middleware per ERP. When a new branch location comes online with a different ERP, add a node — the rest of the pipeline stays unchanged.
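The fan-out-and-normalize step can be sketched as concurrent queries against branch adapters whose response shapes differ, reduced to one common format. The branch functions and both response shapes are illustrative assumptions standing in for real ERP API calls.

```python
# Sketch: query several branch ERPs at once, then normalize their
# differing response shapes into one common record format.
from concurrent.futures import ThreadPoolExecutor

def branch_east(item):
    return {"ItemNumber": item, "OnHand": 40}    # ERP A's response shape

def branch_west(item):
    return {"sku": item, "available_qty": 25}    # ERP B's response shape

def normalize(branch, raw):
    # Map both shapes onto one schema for the web portal.
    return {
        "branch": branch,
        "item": raw.get("ItemNumber", raw.get("sku")),
        "qty": raw.get("OnHand", raw.get("available_qty")),
    }

BRANCHES = {"east": branch_east, "west": branch_west}

def query_all(item):
    with ThreadPoolExecutor() as pool:
        futures = {name: pool.submit(fn, item) for name, fn in BRANCHES.items()}
    return [normalize(name, f.result()) for name, f in futures.items()]

results = query_all("WIDGET-42")
print(sorted(r["branch"] for r in results))  # ['east', 'west']
```

Adding a branch means adding one adapter to `BRANCHES` — the aggregation and normalization around it stay unchanged, mirroring the add-a-node claim above.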
Unify does not require Intercepts or Resolve. If your primary need is data integration and automation — connecting systems, moving data, enforcing business logic in transit — Unify is a complete solution on its own.
For organizations already using Intercepts, Unify closes the loop: Intercepts detects the exception, and Unify executes the automated response. Add Resolve to manage the business workflow in between.
- License Unify independently — $2,000/active pipeline/year
- Add Intercepts and Resolve at any time as your needs grow
- All products share the same infrastructure, the same Auric scripting layer, and the same web-based management interface
Pricing
Unify is licensed per active pipeline — $2,000 per pipeline per year. A pipeline is one defined workflow in your Unify environment. The number of inputs, modules, outputs, or systems a pipeline connects to does not affect the price.
No per-execution fees. No data volume limits. No cloud costs. Self-hosted on your Windows Server infrastructure.
See What Unify Can Do in Your Environment
Tell us about the systems you need to connect and we’ll walk through how a Unify pipeline would handle it.
Request a Demo
