Monitor APIs, microservices, SQL results and (much) more.
Thanks to AI, setting up dashboards takes a fraction of the time needed when using traditional monitoring systems or dashboard frameworks. Monitored data (JSON, SQL results, command output) can be sent directly, skipping the tedious step of parsing into individual metrics or a proprietary format.
Define any alerting logic using JavaScript.
It's incredibly easy to monitor all kinds of things. Examples:
Rails query results
API response
Microservice health check
Cron job
Unit test results
CPU, disk, memory
SQL results
Ad-hoc data (Python)
Ad-hoc data (Unix)
require 'rest_client'

articles = Article.all

# Send a JSON representation of the query results.
RestClient.post '', articles.to_json, :params => {:key => '9fxvMi8aR3CZ5BsNj0rt0odW'}
# Directly send JSON API response.
curl | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: -XPOST --data-binary @-

# If you need to monitor not only the content of an API response, but also metadata like status codes or headers, use the 'moniqueio curl' command.
moniqueio curl -XPOST --data 'key1=val1' | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: -XPOST --data-binary @-
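What such a wrapped report might look like is easy to picture: body, status code and headers bundled into one JSON document. A hypothetical Python sketch of that idea (the field names are guesses for illustration, not the tool's actual output format):

```python
import json

def wrap_response(status_code, headers, body):
    # Bundle response metadata together with the body so that both
    # can be monitored and alerted on. All field names are illustrative.
    return json.dumps({
        "status_code": status_code,
        "headers": dict(headers),
        "body": body,
    })

report = wrap_response(200, {"Content-Type": "application/json"}, '{"ok": true}')
```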
var restler = require('restler');

setInterval(function() {
    // Tags identify a microservice instance. They allow auto-creating
    // dashboards with a tile created for each microservice instance.
    var tags = ['microservice:search', 'pid:' + process.pid];

    // Send a status report containing memory usage data.
    restler.postJson('', {
            status: 'ok',
            // Arbitrary JSON-serializable data can be included in the sent data.
            memory: process.memoryUsage()
        }, { query: {
            tags: tags.join(','),
            key: '9fxvMi8aR3CZ5BsNj0rt0odW'
        }});
}, 60000);
# 'moniqueio run' runs a command and outputs a JSON report summarizing the run, which can be submitted to the API.
10 0 * * *    moniqueio run | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @-

# Tags can be used to automatically create dashboard tiles for each job name (see sample dashboard below).
10 0 * * *    moniqueio run | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @-
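The exact format of the JSON report that 'moniqueio run' emits isn't shown here; purely as an illustration, a run summary of this kind could be built like the following Python sketch (every field name below is hypothetical):

```python
import json
import subprocess
import time

def run_summary(argv):
    # Run a command and summarize the run as JSON. The real 'moniqueio run'
    # output format is not documented here, so all fields are illustrative.
    start = time.time()
    proc = subprocess.run(argv, capture_output=True, text=True)
    return json.dumps({
        "command": " ".join(argv),
        "exit_code": proc.returncode,
        "duration_s": round(time.time() - start, 3),
        "stdout_tail": proc.stdout.splitlines()[-5:],  # last few output lines
    })

report = run_summary(["echo", "backup done"])
```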
# Run unit tests, parse the results using the moniqueio tool, and send them to the API.

*/30 * * * * 2>&1 | moniqueio unittest_summarize | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @-
# Send log lines containing WARN or ERROR. The 'newcontent' command outputs lines that appeared since the previous invocation. The 'format=single' query parameter ensures the sent data is treated as a single piece of content.

*/5 * * * *    moniqueio newcontent /home/ubuntu/app.log | grep -E 'WARN|ERROR' | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @- ''
# Send system-level reports. The '--tag-ip' option tags sent reports with an IP address, making it possible to auto-create dashboards for multiple IPs.

*/15 * * * * moniqueio --api-key 9fxvMi8aR3CZ5BsNj0rt0odW sysreports --tag-ip
# Send PostgreSQL results
$ psql -c "SELECT country, COUNT(*) FROM user GROUP BY country" | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @-

# Send MySQL results
$ mysql -e "SELECT country, COUNT(*) FROM user GROUP BY country" | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @-

# Send Oracle results
$ echo "SELECT * FROM EMPLOYEE;" | sqlplus -s HR/oracle | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: -XPOST --data-binary @-

# Send MongoDB results
$ mongo --quiet --eval 'JSON.stringify( db.restaurants.find().limit(10).toArray() )' | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: -XPOST --data-binary @-
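The shell one-liners above pipe CLI output straight to curl; from application code, the same idea is a query plus one POST. A minimal Python sketch, using an in-memory SQLite database purely for illustration (the report URL and API key are omitted, as in the shell examples):

```python
import json
import sqlite3

# Build a throwaway in-memory database to stand in for a real one.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE user (name TEXT, country TEXT)")
conn.executemany("INSERT INTO user VALUES (?, ?)",
                 [("a", "us"), ("b", "us"), ("c", "uk")])

# Serialize the query results to JSON, ready for a single HTTP POST.
rows = conn.execute(
    "SELECT country, COUNT(*) AS cnt FROM user GROUP BY country ORDER BY country"
).fetchall()
payload = json.dumps([{"country": c, "count": n} for c, n in rows])
# requests.post(REPORT_URL, params={"key": API_KEY}, data=payload)
```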
import requests
import json

# send a single number
num = 86
requests.post('', params={'key': '9fxvMi8aR3CZ5BsNj0rt0odW'}, data=str(num))

# send a list of numbers
num_list = [86, 43, 32]
requests.post('', params={'key': '9fxvMi8aR3CZ5BsNj0rt0odW'}, data=json.dumps(num_list))

# send a dict
d = {'x': 86, 'y': 43, 'z': 32}
requests.post('', params={'key': '9fxvMi8aR3CZ5BsNj0rt0odW'}, data=json.dumps(d))

# send a nested dict
nd = {'x': {'nums': [1, 2], 'message': 'OK'}, 'y': 3.42}
requests.post('', params={'key': '9fxvMi8aR3CZ5BsNj0rt0odW'}, data=json.dumps(nd))

# send data from a string
cdata = """
us 120
uk 34
de 27
it -
"""
requests.post('', params={'key': '9fxvMi8aR3CZ5BsNj0rt0odW'}, data=cdata)
# Send kernel version (input doesn't need to be numeric - in general, parsed content can be any JSON value)
$ uname -r | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @-

# Send memory usage data
$ cat /proc/meminfo | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @-

# Directly send 'docker' command output
$ docker stats | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: -XPOST --data-binary @-

# Directly send 'aws' command output
$ aws ec2 describe-instances | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: -XPOST --data-binary @-

# Send disk status coming from 'smartctl' tabular output
$ sudo smartctl -A /dev/sda | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @-

# Send last 10 lines of logs, without parsing the input into a table
$ dmesg | tail -10 | curl --user 9fxvMi8aR3CZ5BsNj0rt0odW: --request POST --data-binary @- ''

Sample dashboard
See a live, read-only view of the dashboard (created using the dashboard-sharing function)
Examples of sending data are deliberately boring: whether it's a JSON document, SQL results or a text file, the same HTTP POST call handles the processing, even without specifying an input format. The data is automatically parsed into a tabular format and is immediately available for graphing and alerting (for details, see the introductory blog post Replace AWK and Bash scripts with AI).
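As a rough illustration of the idea (not the service's actual parsing algorithm), whitespace-separated output like the examples above maps naturally onto rows and cells:

```python
# Illustrative only: a naive whitespace-based parse of tabular text into rows.
# Real auto-parsing would additionally detect headers, column types, etc.
raw = """us 120
uk 34
de 27"""

table = [line.split() for line in raw.splitlines() if line.strip()]
```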

Adding charts to a dashboard is very easy: just click the values you want to chart.

(8-second video)
The sent data is available to JavaScript code that can perform tests and trigger alarms. The "dry run" mode and print calls make the code easy to inspect and debug.

Using JavaScript gives you the freedom to define any alerting logic; you aren't limited to checking numerical values. Advanced features like webhook integration and querying data from multiple reports are also available.
Defining alarms in JavaScript
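The alarm code itself runs as JavaScript inside the service; purely to illustrate the shape of such a check, here is an equivalent sketch in Python (the report fields follow the microservice example above, and the function name and threshold are arbitrary):

```python
def check_report(report):
    """Return an alert message, or None if the report looks healthy.

    Illustrative only: the service runs this kind of logic as JavaScript,
    and the field names here simply mirror the microservice example.
    """
    if report.get("status") != "ok":
        return "status is not ok"
    heap = report.get("memory", {}).get("heapUsed", 0)
    if heap > 200 * 1024 * 1024:  # arbitrary 200 MB threshold
        return "heap usage above 200 MB"
    return None
```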
Time savings
The reduction in the time needed to get dashboards with meaningful data is usually dramatic: days become hours, and hours become minutes. Creating plugins for traditional monitoring systems, integrating with dashboard frameworks, and post-processing data are tedious tasks that are largely unnecessary here.
It's so easy to push data from relevant sources, like databases and APIs, that you will end up with far more meaningful information than traditional tools provide. It's good to know what's really going on with your product.
It's nice when CPU usage is monitored by an automated system, but what disrupts your sleep are mostly errors that automated systems can't detect on their own: errors that require application-level checks. This is a fantastic tool for implementing them with minimal effort.
Start Free Trial

(or just see pricing)