Alloy API to Postgres and Superset

Here is a method to connect Alloy directly to PostgreSQL and Apache Superset.

Within Alloy you can create cards, which show the results of AQS queries within the system. This method makes those card results available in PostgreSQL, and from there in Apache Superset for advanced dashboards.

It also gives you access to the data via SQL. You could use any AQS query to build these functions; I am just using cards because they are an automatically updating view of the data.

All the code is attached here; you just need to populate your API key.

drop table if exists alloy_api_key cascade;

drop table if exists alloy_cards;

create table alloy_cards (
    card_datetime timestamp,
    card_name text,
    query_name text,
    value double precision
);

create table alloy_api_key ( alloy_api_key character varying(50));

insert into alloy_api_key(alloy_api_key) values('YOUR_API_KEY_HERE');

create or replace function alloy_get_all_cards(alloy_api_key character varying(50))
returns table (card_name text, query_name text, value double precision)
AS $$
import requests
import json

def get_alloy_cards():
    # Fetch the list of cards visible to this API key
    url = "https://api.uk.alloyapp.io/api/card?Context=Customer&token=" + alloy_api_key
    headers = {'content-type': 'application/json'}
    try:
        r = requests.get(url, headers=headers)
        r.raise_for_status()
    except requests.exceptions.HTTPError as err:
        plpy.error("Alloy card list request failed: " + str(err))
    return r

def get_alloy_card_queries(card_code):
    # Fetch a single card, including its queries and their current values
    url = "https://api.uk.alloyapp.io/api/card/" + card_code + "?token=" + alloy_api_key
    headers = {'content-type': 'application/json'}
    try:
        r = requests.get(url, headers=headers)
        r.raise_for_status()
    except requests.exceptions.HTTPError as err:
        plpy.error("Alloy card request failed: " + str(err))
    return r

results = []
r = get_alloy_cards()
rc = json.loads(r.content)
for card in rc["results"]:
    r2 = get_alloy_card_queries(card["card"]["code"])
    rc2 = json.loads(r2.content)
    for query in rc2["card"]["queries"]:
        # one output row per (card, query) pair
        results.append([card["card"]["name"], query["name"], query["value"]])

return results
$$ LANGUAGE plpython3u;

insert into alloy_cards(card_datetime, card_name, query_name, value)
select now(), card_name, query_name, value
from alloy_get_all_cards((select alloy_api_key from alloy_api_key))
order by card_name, query_name;

select * from alloy_cards;
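Each run of the insert above adds one snapshot of the current card values, so to build up a history for time-based dashboards it needs to run on a schedule. A minimal sketch, assuming the pg_cron extension is available (an external scheduler calling psql would do just as well; the job name and hourly schedule are only illustrative):

-- hypothetical hourly refresh job using pg_cron
select cron.schedule(
    'alloy-cards-refresh',
    '0 * * * *',
    $$insert into alloy_cards(card_datetime, card_name, query_name, value)
      select now(), card_name, query_name, value
      from alloy_get_all_cards((select alloy_api_key from alloy_api_key))$$
);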


Impressive stuff, great to see more people working with the API directly and building integrations ❤️

Hi. I'm not a professional developer. It's even more impressive in Superset, as you can do time-based monitoring of the cards and zoom in to days, weeks, months, etc.
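As an illustration, this is the sort of query a Superset chart could be built on, assuming the insert above runs regularly so alloy_cards accumulates snapshots over time (the daily average is just an example aggregation):

-- daily average per card query, suitable as a Superset dataset or chart query
select date_trunc('day', card_datetime) as day,
       card_name,
       query_name,
       avg(value) as avg_value
from alloy_cards
group by 1, 2, 3
order by 1, 2, 3;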


Looks great, just be aware of "Removal of old API key mechanism 🔑" in the Alloy announcements.

You will have to generate a new API token (super easy).
But you will also need to move the token from the URL query and add it as a header:
"Authorization": "Bearer {{alloy_api_key}}"
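As a rough sketch, the request helpers in alloy_get_all_cards above could be adapted like this, assuming the endpoints are otherwise unchanged (only the first helper is shown; the card-detail helper changes the same way):

def get_alloy_cards():
    # Token is no longer passed in the URL query string; it is sent as a bearer header instead
    url = "https://api.uk.alloyapp.io/api/card?Context=Customer"
    headers = {
        'content-type': 'application/json',
        'Authorization': 'Bearer ' + alloy_api_key
    }
    r = requests.get(url, headers=headers)
    r.raise_for_status()
    return r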

Hi.

Yes, I know that; it's easy to modify…

headers = {'Content-type': 'application/json', 'token': alloy_api_key}
