Using the API to create an integration with a new SQL database

I want to use the API to create a live feed that takes all information from the system and replicates it to an Azure SQL Database via an endpoint.

The goal is to have a live SQL database on our Azure Platform that is constantly updated via the API with every new transaction.

Has anyone ever tried this?

Hi Jelani,

I’m sorry to say that our system is currently unable to handle live data streaming. We would also advise against attempting it, for a few reasons:

  • Alloy is optimized for efficiently processing bulk operations on tens or hundreds of thousands of data points at a time, not a continuous stream of single transactions.
  • The data ingress costs on your side would likely be high.
  • The technical requirements for live synchronization are demanding, including failover and resiliency handling.
  • Mapping a NoSQL data store to a relational database is challenging and can cause performance issues on the destination database if the data is not properly transformed.

However, there are other ways to access the data you need from Alloy. To start, we recommend identifying the key information you require in the other data store. From there, we can suggest some Alloy features that might aid your integration:

  • Nightly exports of the data you require.
  • Use our Sync API to access the data you need on a schedule (e.g., daily). Used correctly, this also lets you implement delta syncing so you only pull the differences.
  • Workflows with HTTP POST actions to push changes to specific designs to a destination where you can process the changes.
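The delta-syncing idea in the second bullet can be sketched in a few lines. This is a hypothetical illustration, not the Sync API's actual shape: it assumes each scheduled pull yields a snapshot mapping item IDs to some version marker (e.g. a signature or last-edit date), and that only the computed differences are written to the SQL destination.

```python
# Sketch of delta syncing between two scheduled pulls. The snapshot
# format (ID -> version marker) is an assumption for illustration.

def compute_delta(previous: dict, current: dict) -> dict:
    """Return the items created, updated, or deleted since the last sync."""
    created = {k: v for k, v in current.items() if k not in previous}
    updated = {k: v for k, v in current.items()
               if k in previous and previous[k] != v}
    deleted = [k for k in previous if k not in current]
    return {"created": created, "updated": updated, "deleted": deleted}

# Example: a nightly run compares yesterday's snapshot to today's.
yesterday = {"item-1": "sig-a", "item-2": "sig-b"}
today     = {"item-1": "sig-a", "item-2": "sig-c", "item-3": "sig-d"}
delta = compute_delta(yesterday, today)
# Only delta["created"] and delta["updated"] need upserting into SQL;
# delta["deleted"] drives row removal on the destination side.
```

This keeps the nightly load proportional to what changed rather than to the full data set.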

These are just a few of the solutions available, so we recommend reaching out to our support and implementation teams to explore what else can be achieved.

I hope this information helps as a starting point! Please let us know if you have any further questions or concerns.


We are doing something similar to this. We have an Oracle Spatial database on premise. This is populated with assets and defects from Alloy. Our GIS solution feeds from the db and presents maps to customers who want to report defects. The maps are therefore up to date with new assets and live defects - those customers are then able to see if their defect has already been reported, thereby reducing duplication.

We have workflows which trigger on creation/deletion of assets and on creation/status-change of defects. They use HTTP POST actions to an endpoint on Azure with the ID of the item. We then query the API for the item's details and send them to our Spatial db.

We don’t export Inspections or Jobs - instead we update the status of the Defect as work progresses. Workflows listen for changes to the Defect status only, so this keeps the number of transactions at a sensible level.

We decided against using HTTP POST with all attributes because those workflow nodes have to be created via the API and the JSON gets quite complicated: each item has different attributes, all of which have to be coded by hand. Just sending the ID and then querying the API keeps things much simpler.
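The ID-only pattern described above can be sketched as a small handler. All names here (the `itemId` payload field, the fetch and upsert callables) are illustrative assumptions, not Alloy's actual schema: the workflow POSTs just the item ID, the handler fetches the full record from the API, and writes it to the destination database.

```python
import json

# Sketch of the "ID-only webhook" pattern: the POST body carries only
# the item ID; full attributes are fetched from the API afterwards.
# Payload shape and callable signatures are assumptions for illustration.

def handle_webhook(body: str, fetch_item, upsert_row) -> str:
    """Process one webhook call: parse the ID, fetch details, upsert."""
    payload = json.loads(body)
    item_id = payload["itemId"]        # only the ID arrives in the POST
    details = fetch_item(item_id)      # full attributes come from the API
    upsert_row(item_id, details)       # write/update the destination row
    return item_id

# Example with stubbed-out API and database calls:
fake_api = {"abc123": {"status": "Reported", "geometry": "POINT(0 0)"}}
stored = {}
handle_webhook('{"itemId": "abc123"}',
               fake_api.__getitem__,
               lambda k, v: stored.update({k: v}))
# stored now holds the full record keyed by ID
```

Keeping the POST payload to a bare ID means the workflow configuration never has to mirror each design's attribute schema, which matches the simplification described above.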