Tips & tricks for installing and running ICS products

WebSphere liberty docker on Synology NAS

Tom Bosmans  21 June 2018 16:44:55
I've got a Synology DS415+ at home and have Docker running on it.  I needed a quick way to install a WebSphere Liberty server, and since the Synology NAS supports Docker containers, why not ...  It's very easy to get up and running; you just need a few extra configuration settings.

Please note that I'm not sure whether this works on every Synology, though.  I think you need a Synology with an Intel CPU (mine is an Intel Atom C2538) ...


Install the Docker package on your Synology NAS using the Package Center.

Image:WebSphere liberty docker on Synology NAS

Start Docker once it's installed.  In the Registry, you can search for "liberty".  Use the "Download" button to download the image.

The Synology uses Docker Hub, and this is the version you want to download:

There's more information there, for instance on how to handle your certificates and key databases.

Image:WebSphere liberty docker on Synology NAS

Once the download of the image is complete, select Liberty and click "Launch".  This creates an actual container from the image.

Image:WebSphere liberty docker on Synology NAS

You can then configure the container.  In particular, you need to configure the volumes and the ports.
Since the container's filesystem is not persistent, you need volumes to save data between restarts.

Image:WebSphere liberty docker on Synology NAS

These 3 volumes are needed for the following paths:

/opt/ibm/wlp/output/ (or, more precisely, the path that's in the WLP_OUTPUT_DIR variable)

/logs (or, more precisely, the path in the LOG_DIR variable)

/config (the server configuration directory)

The documentation states you just need /logs and /config, but I found that the first path is also necessary.

You can also choose to do this later, by using the "Edit" button.
This is my volume configuration:

Image:WebSphere liberty docker on Synology NAS

The ports, by default, are set to Automatic.  This means they change after every restart, and that's not very handy.
I chose ports 19080 and 19443 for the HTTP and HTTPS ports respectively.

Image:WebSphere liberty docker on Synology NAS

This concludes the configuration of the Docker container.
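For reference, the settings described above map onto a plain `docker run` invocation.  The sketch below assembles that command in Python; the host paths, the container name, and the assumption that the image's internal ports are Liberty's defaults (9080/9443) are mine, not taken from the Synology UI, so adjust them to your NAS:

```python
# Sketch: assemble the `docker run` command matching the Synology UI settings.
# Assumptions: image name "websphere-liberty", Liberty's default container
# ports 9080/9443, and hypothetical host paths under /volume1/docker/liberty.
host_base = "/volume1/docker/liberty"

volumes = {
    host_base + "/output": "/opt/ibm/wlp/output",  # WLP_OUTPUT_DIR
    host_base + "/logs": "/logs",                  # LOG_DIR
    host_base + "/config": "/config",              # server configuration
}
ports = {"19080": "9080", "19443": "9443"}  # host -> container

cmd = ["docker", "run", "-d", "--name", "websphere-liberty1"]
for host_path, container_path in volumes.items():
    cmd += ["-v", host_path + ":" + container_path]
for host_port, container_port in ports.items():
    cmd += ["-p", host_port + ":" + container_port]
cmd.append("websphere-liberty")

print(" ".join(cmd))
```

Running the printed command over SSH on the NAS should give roughly the same result as the UI wizard.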

Configure Liberty

To get a meaningful Liberty server, you probably want to deploy your own configuration.
Using File Station on the Synology, I have the following folder structure (which contains the volume configuration of the container).

Image:WebSphere liberty docker on Synology NAS

In the config directory, the magic happens.  As with a "normal" Liberty installation, you have a server.xml file here (empty by default).
There's also an "apps" directory, which contains your ear files.

In my case, I've used a simple configuration that you can download here: server.xml

Image:WebSphere liberty docker on Synology NAS

This configuration contains a basic user registry and an LTPA configuration, and has 2 applications installed: the adminCenter and defaultApplication.ear (Snoop).
There are some specific steps to take before everything will work:

SSL configuration

When you start the Docker image, a default key configuration is generated.  You can of course use your own key database, but I chose the quick and easy solution.

Open the keystore.xml file that's in config/configDropins/defaults.  Use the password of the defaultKeyStore in the keyStore element of your own server.xml.

<keyStore id="defaultKeyStore" password="<replace with your keystore.xml password>" />


There are multiple ways to install the adminCenter; this is the method I followed:

Click on "Details" with the websphere-liberty container selected.
Switch to the "Terminal" tab.
Click on "Create" to create a new Bash terminal session.

Image:WebSphere liberty docker on Synology NAS

Use the following commands to install the adminCenter:

root@websphere-liberty1:/# cd /opt/ibm/wlp/bin                                                                          
root@websphere-liberty1:/opt/ibm/wlp/bin# ./installUtility install adminCenter-1.0      

After restarting the Docker container, the adminCenter is available at the following URL: https://:19443/adminCenter .
Image:WebSphere liberty docker on Synology NAS

You need to log in with the admin user (if you use the server.xml provided here, the password is Passw0rd).

Image:WebSphere liberty docker on Synology NAS

More information on the adminCenter application can be found here :

Default Application

WebSphere Application Server comes out of the box with a DefaultApplication (aka Snoop) that is handy to see whether your server is working correctly.  Now unfortunately, there is no DefaultApplication.ear that comes with Liberty.
This version of DefaultApplication.ear works with Liberty.

So download this file and upload it to your Synology, in the "apps" directory.  Your Liberty server will install it automatically (or restart the Docker image, so the server.xml also becomes active).

The Snoop servlet is then available at https://:19443/snoop .  You do need to log in (if you use the server.xml that's provided here).

Image:WebSphere liberty docker on Synology NAS

Log files

The log that's shown on the "Details" page is not very useful.
Image:WebSphere liberty docker on Synology NAS
Fortunately, you can use File Station on the Synology to access the "log" directory, where the standard messages.log is (along with the other log files, like ffdc logs, if you're interested in those).

How to convert Notes names to email addresses in the Notes client.

Tom Bosmans  12 June 2018 10:55:51
A golden oldie ...  I recently had to generate a list of email addresses for use in a Cloud application, based on a mail group in my personal address book.

The names in that group are in the Notes format, obviously, and I need the email address.

Now I didn't have my Designer ready, nor did I feel like accessing the address books directly.  And since this actually was a question from a colleague of mine with no Notes knowledge at all, I needed to find something that works in a regular Notes client.

So I remembered that Notes includes a built-in @Formula tester.  If you put your @Formula code in any (regular) field and press SHIFT-F9, the content of the field is interpreted as Formula language and executed.

The solution:

Create a colon ( : ) separated list of formulas (@NameLookup) for all the Notes addresses you have.  If these contain the @ part (Notes-specific routing information), strip that off.  You can easily do that in a spreadsheet or in a text editor.

I end up with a list of formulas looking like this:

@NameLookup([Noupdate];"Tom Bosmans/GWBASICS";"InternetAddress"):
@NameLookup([Noupdate];"Joske Vermeulen/GWBASICS";"InternetAddress")

(The limit here is the maximum size of a text field in Notes, which is about 900 entries.  I had to process about 500, so not a problem.)
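If you'd rather script the list-building step than use a spreadsheet or text editor, a small sketch could look like this (the names below are made-up examples):

```python
# Turn a list of Notes names into a colon-separated list of @NameLookup
# formulas, stripping the Notes-specific @-routing suffix first.
# The names below are made-up examples.
names = ["Tom Bosmans/GWBASICS@GWBASICS", "Joske Vermeulen/GWBASICS"]

formulas = []
for name in names:
    name = name.split("@")[0]  # drop the routing part after '@'
    formulas.append('@NameLookup([Noupdate];"%s";"InternetAddress")' % name)

print(":\n".join(formulas))
```

The printed result is exactly the colon-separated list you can paste into the Subject field.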

Then, I open a new mail and paste the formulas in the "Subject" field.
Image:How to convert Notes names to email addresses in the Notes client.

Select the Subject field, press SHIFT-F9, and the formulas will be executed.  The result is a list of email addresses.

VMWare Workstation command line

Tom Bosmans  18 May 2018 10:11:38
Get a list of running virtual machines
vmrun list

Use that output to get the IP address of that guest:
vmrun getGuestIPAddress  /run/media/tbosmans/fast/Connections_6_ICEC/Connections_6.vmx

(note that this particular call is pretty buggy, and does not return the correct IP address if you have multiple interfaces : ...  still, it can be pretty useful)

You can run any command in the guest, as long as you authenticate properly (-gu <guest user> -gp <guest password>).

So for instance, this command lists all running processes, and you can use the output to actually do something with these processes in a next step (e.g. kill them):

vmrun -gu root -gp listProcessesInGuest  /run/media/tbosmans/fast/Connections_6_ICEC/Connections_6.vmx

You can also run any other command using that mechanism.
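The kill-them-in-a-next-step idea can be scripted.  This sketch parses the listProcessesInGuest output (assuming the usual `pid=..., owner=..., cmd=...` line format; verify against your vmrun version) and feeds matching pids to vmrun's killProcessInGuest:

```python
import re
import subprocess

def parse_guest_processes(output):
    """Parse `vmrun listProcessesInGuest` output into (pid, cmd) tuples."""
    procs = []
    for line in output.splitlines():
        m = re.match(r"pid=(\d+), owner=(\S+), cmd=(.*)", line)
        if m:
            procs.append((int(m.group(1)), m.group(3)))
    return procs

def kill_matching(vmx, user, password, procs, pattern):
    """Kill every guest process whose command line matches pattern."""
    for pid, cmd in procs:
        if re.search(pattern, cmd):
            subprocess.run(["vmrun", "-gu", user, "-gp", password,
                            "killProcessInGuest", vmx, str(pid)], check=True)

# Parsed from a made-up sample here instead of a live vmrun call:
sample = ("Process list: 2\n"
          "pid=1, owner=root, cmd=/sbin/init\n"
          "pid=42, owner=root, cmd=/usr/bin/java -Xmx2g")
print(parse_guest_processes(sample))
```

You would feed the real output of `vmrun ... listProcessesInGuest` into parse_guest_processes and then call kill_matching with, say, pattern "java".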

IBM Connections Communities Replay events DB2 queries

Tom Bosmans  23 January 2018 17:33:25
Working on a recent problem where events were not processed, I was looking at the wsadmin commands that provide information.
The Jython code supplied, for instance CommunitiesQEventService.viewQueuedEventsByRemoteAppDefId("Blog", None, 100), is pretty useless in situations where you have hundreds of thousands of events in the queue.  The Jython code in the wiki is also plain wrong (but that's a different story).

So I turned to the DB2 database, to examine the LC_EVENT_REPLAY table.  Unfortunately, the interesting detailed information is stored as XML in a CLOB field called EVENT.
It took me quite a bit of time to figure out how to get the information out of that field in an SQL query.

In fact, the most puzzling part was the notation needed for the XML root element and the node elements: they all need to use the namespace.  Using a wildcard for the namespace is sufficient in this case.
So this query gives you some detailed information about events in the replay table:

SELECT X.*
FROM LC_EVENT_REPLAY T,
     XMLTABLE( '$tev/*:entry' PASSING XMLPARSE(DOCUMENT T.EVENT) AS "tev"
       COLUMNS
         "title"       VARCHAR(512) PATH '*:title/text()',
         "author"      VARCHAR(128) PATH '*:author/*:email/text()',
         "communityid" VARCHAR(128) PATH '*:container/@id',
         "community"   VARCHAR(128) PATH '*:container/@name'
     ) AS X
ORDER BY X."communityid";

Of course, you can pull any information you like out of the EVENT XML, but using this query as a start should help you immensely :-) .
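If you prefer to post-process events outside SQL, the same wildcard-namespace trick can be applied in Python.  The EVENT sample below is hypothetical, modeled on the Atom-style entries the query above extracts from:

```python
import xml.etree.ElementTree as ET

# Hypothetical EVENT value: an Atom-style, namespace-qualified entry with
# the same fields the SQL query extracts (title, author email, container).
event = """<entry xmlns="http://www.w3.org/2005/Atom" xmlns:x="urn:example">
  <title>Some event</title>
  <author><email>tom@example.com</email></author>
  <x:container id="abc-123" name="My community"/>
</entry>"""

def local(tag):
    """Strip the {namespace} prefix: the equivalent of the *: wildcard."""
    return tag.rsplit('}', 1)[-1]

root = ET.fromstring(event)
fields = {}
for el in root.iter():
    if local(el.tag) == 'title':
        fields['title'] = el.text
    elif local(el.tag) == 'email':
        fields['author'] = el.text
    elif local(el.tag) == 'container':
        fields['communityid'] = el.get('id')
        fields['community'] = el.get('name')

print(fields)
```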

Custom dynamic dns on Ubiquity router with

Tom Bosmans  5 January 2018 16:57:04

Ubiquiti EdgeRouter X

The Ubiquiti EdgeRouter X is a very cheap but very powerful router with a lot of options.  It's based on EdgeOS, which is a Linux-based distro.
That basically allows you to do "anything" you want.

I got it from Alternate, for around 54 euros ...

Dynamic DNS

I would like to finally set up a VPN solution, so I can safely access my systems from wherever I am.  My EdgeRouter X has these capabilities, so I was looking for a way to set it up.

The first thing to do is look for a dynamic DNS provider.  In the past, I used (long, looong ago), but they don't offer dynamic DNS services anymore as far as I can tell.
I looked at several free dynamic DNS providers, but couldn't figure them out (it's probably me).

So I went looking at what my 'real' DNS provider has to offer ( ).  It turns out a dynamic DNS service was added recently (27 December 2017).

Dynamic DNS on

Really simple to do: the UI has a new section 'dynamic dns', where you add a new subdomain.  That subdomain is then listed among your regular subdomains.
I did seem to have problems when using longer passwords, but that may have been a different problem ...

More information :

Dynamic DNS configuration on Edgerouter


The EdgeRouter uses a pretty standard ddclient package.

Web UI

Through the web UI, the options are limited.  Specifically, the protocol is limited to a subset of what ddclient has to offer, even though the Service field says "custom" ...

Image:Custom dynamic dns on Ubiquity router with
Bottom line: it doesn't work, and it is not as "custom" as I would like.


The EdgeRouter allows SSH access; I have configured it to use SSH keys for me.

There is a series of commands to configure the dynamic DNS feature (like in the web UI), but although that offers a few more options, it's still not sufficient.

Custom ddclient

Luckily, ddclient is just a simple Perl script, so it's easy to modify.  The problem with the code is that it contains hardcoded elements (like the /update.php? part in the update code).
There are 3 sections to change:
- variables
- examples
- update code

I copied the code from the duckdns sections and adapted it.

Open ddclient with a text editor, as root (sudo su -).  The ddclient file is here:


Add the keysystems definitions at the end of the %services section (after woima, in my case):

   'woima' => {
       'updateable' => undef,
       'update'     => \&nic_woima_update,
       'examples'   => \&nic_woima_examples,
       'variables'  => merge(
   'keysystems' => {
       'updateable' => undef,
       'update'     => \&nic_keysystems_update,
       'examples'   => \&nic_keysystems_examples,
       'variables'  => merge(

Add the variables to the %variables object (somewhere at the end is fine):

'keysystems-common-defaults'       => {
    'server'              => setv(T_FQDNP,  1, 0, 1, '', undef),
    'login'               => setv(T_LOGIN,  0, 0, 0, 'unused',            undef),


Copy the example code and the update code to the end of the file.

## nic_keysystems_examples
sub nic_keysystems_examples {
    return <<EoEXAMPLE;
o 'keysystems'

The 'keysystems' protocol is used by the non-free
dynamic DNS service offered by and
Check for API

Configuration variables applicable to the 'keysystems' protocol are:
  protocol=keysystems          ##
  server=www.fqdn.of.service   ## defaults to
  password=service-password    ## password (token) registered with the service
                               ## the host registered with the service.

Example ${program}.conf file entries:
  ## single host update
  protocol=keysystems,                                       \\
  password=prettypassword                    \\
EoEXAMPLE
}


## nic_keysystems_update
## by Tom Bosmans
## response contains "code = 200" on successful completion
sub nic_keysystems_update {
    debug("\nnic_keysystems_update -------------------");

    ## update each configured host
    ## should improve to update in one pass
    foreach my $h (@_) {
        my $ip = delete $config{$h}{'wantip'};
        info("KEYSYSTEMS setting IP address to %s for %s", $ip, $h);
        verbose("UPDATE:", "updating %s", $h);

        # Set the URL that we're going to use to update
        my $url;
        $url  = "http://$config{$h}{'server'}/update.php";
        $url .= "?hostname=";
        $url .= $h;
        $url .= "&password=";
        $url .= $config{$h}{'password'};
        $url .= "&ip=";
        $url .= $ip;
        # Try to get URL
        my $reply = geturl(opt('proxy'), $url);

        # No response, declare as failed
        if (!defined($reply) || !$reply) {
            failed("KEYSYSTEMS updating %s: Could not connect to %s.", $h, $config{$h}{'server'});
            next;
        }
        last if !header_ok($h, $reply);

        if ($reply =~ /code = 200/) {
            $config{$h}{'ip'}     = $ip;
            $config{$h}{'mtime'}  = $now;
            $config{$h}{'status'} = 'good';
            success("updating %s: good: IP address set to %s", $h, $ip);
        } else {
            $config{$h}{'status'} = 'failed';
            failed("updating %s: Server said: '$reply'", $h);
        }
    }
}

Save the file and restart the ddclient service.

sudo service ddclient restart

This just checks that the code still parses.  Now for the configuration.

We need 2 files:


Note that you can generate the second file by using the web UI of the EdgeRouter, or the console commands.  The values in the web UI or console command don't matter; you will delete everything anyway.
You need to edit these files as root (sudo su -).

/etc/ddclient.conf :

# Configuration file for ddclient generated by debconf
# /etc/ddclient.conf



The important variables here are the password, and the last line: the hostname you defined in the Domaindiscount24 web interface.

# autogenerated by on Fri Jan  5 12:58:19 UTC 2018
use=if, if=eth0


Save both files.

You can now force an update of the ddns by issuing an EdgeOS command:

update dns dynamic interface eth0

You can put a tail on the messages log to see the results:

tail -f /var/log/messages

The result should be something like this :

Jan  5 15:20:06 ubnt ddclient[10616]: SUCCESS:  updating good: IP address set to
Jan  5 16:39:02 ubnt ddclient[13381]: SUCCESS:  updating good: IP address set to

Of course, instead of editing the files directly on your router, you could copy them over using scp ... and edit them on your own desktop machine.


Alas, no supportability.  EdgeOS updates will likely wipe the changes away.
Also, using the web UI or console to update the dynamic DNS settings will wreak havoc on the configuration.  I am working on getting the updates into SourceForge ( / ), but don't hold your breath for these changes to make it all the way down to Ubiquiti.
So the solution is not ideal, but it works for now ...

Trying out Domino data services with Chart.js

Tom Bosmans  4 December 2017 11:06:48
Domino Data Access Services have been around for a few years now, but I never actually used them myself.

Since I recently started to dabble in Ethereum mining, I was looking for a place to store my data and draw some graphs and the like.  I first tried LibreOffice Calc, but I couldn't find an easy way to automatically update it with data from a REST API.
So I turned to good old Domino, the grandpa of NoSQL databases (before it was cool).

The solution I came up with retrieves multiple JSON streams from various sources and combines them into a single JSON document, which is then uploaded into a Domino database (using Python).
To look at the data, I created a literal "SPA" (single page application): I use a Page in Domino to run Javascript code that retrieves the data, again in JSON format, and turns it into a nice graph (using Chart.js).
So I don't actually use any Domino code to display anything; Domino is simply used to store and manage the data.

This article consists of 2 parts:

  • loading data into Domino using Python and REST services.
  • displaying data from Domino using the Domino Data Access Services and an open-source javascript library to display charts ( )

Python to Domino

Domino preparation

To use the Domino Data Access Services in a database, you need to enable them:

  • On the server
  • In the Database properties (Allow Domino Data Service)
  • In the View properties

Server configuration

Open the internet site document for the server/site you are interested in.
In the Configuration tab, scroll down to the "Domino Access Services" section.  Enable "Data" here.

Note that you may want to verify the enabled methods as well: enable PUT if you plan to use the services that use PUT requests.
And if you're not using Internet Site documents yet, well, then I can't help you :-)

After modifying the Internet Site document, you need to restart the HTTP task on your Domino server.
Image:Trying out Domino data services with Chart.js

Database properties

In the Advanced properties, select "Views and Documents" for the "Allow Domino Data Service" option.
Image:Trying out Domino data services with Chart.js

View properties

Open the View properties, and on the second-to-last tab, enable "Allow Domino Data Service operations".
Image:Trying out Domino data services with Chart.js

There is no equivalent option in Forms.

Python code

Instead of figuring out how to load JSON data in a Notes agent or XPages (which no doubt is possible, but seems a lot of work), I chose to use a simple Python script that I kick off using a cron job.  I run this code collocated with the Domino server, but that is not necessary.  Because the POST requires authentication and the URL uses TLS, this could just as well run anywhere else.
Any other server-side code would do the same thing, so Node.js or Perl or ... are all valid options.

There are 2 JSON objects being retrieved:

resultseth = requests.get('{wallet}&email={email address}')
data = resultseth.json()


currentprice = requests.get(',USD,EUR')
pricedata = currentprice.json()

The first JSON that's returned contains nested data (the workers object):

{
  "autopayout_from": "1.0",
  "earning_24_hours": "0.1123",
  "error": false,
  "immature_earning": 0.000890178102,
  "last_payment_amount": "1.0",
  "last_payment_date": "Thu, 16 Nov 2017 16:24:01 GMT",
  "last_share_date": "Mon, 04 Dec 2017 12:41:33 GMT",
  "payout_daily": true,
  "payout_request": false,
  "total_hashrate": 30,
  "total_hashrate_calculated": 31,
  "transferring_to_balance": 0.0155,
  "wallet": "0x5ac81ec3457a71dda2af0e15688d04da9a98df3c",
  "wallet_balance": "5411",
  "workers": {
    "worker1": {
      "alive": true,
      "hashrate": 15,
      "hashrate_below_threshold": false,
      "hashrate_calculated": 16,
      "last_submit": "Mon, 04 Dec 2017 12:38:42 GMT",
      "second_since_submit": 587,
      "worker": "worker1"
    },
    "worker2": {
      "alive": true,
      "hashrate": 15,
      "hashrate_below_threshold": false,
      "hashrate_calculated": 16,
      "last_submit": "Mon, 04 Dec 2017 11:38:42 GMT",
      "second_since_submit": 111,
      "worker": "worker2"
    }
  }
}

It turns out that Domino does not like that very much, or rather cannot handle nested JSON.  But there is a simple solution: flatten the JSON.

This uses the "flatten_json" package in Python, which is easy to use.

In the sample above, it would translate

{ "workers": { "worker1": { "worker": "worker1" } } }

into

{ "workers_worker1_worker": "worker1" }

(Information about this particular API is here.)

The flatten_json package can be installed using pip:

pip install flatten_json
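If you'd rather avoid the dependency, the flattening itself is small enough to sketch by hand; this is roughly what flatten_json does with its default underscore separator:

```python
def flatten(nested, parent_key="", sep="_"):
    """Recursively flatten nested dicts, joining keys with sep.
    A rough stand-in for flatten_json.flatten with default settings."""
    flat = {}
    for key, value in nested.items():
        new_key = parent_key + sep + key if parent_key else key
        if isinstance(value, dict):
            flat.update(flatten(value, new_key, sep))
        else:
            flat[new_key] = value
    return flat

print(flatten({"workers": {"worker1": {"worker": "worker1"}}}))
# → {'workers_worker1_worker': 'worker1'}
```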

From a public API, I can get the current price of ETH expressed in EUR, dollars and Bitcoin.

In Python, I now have 2 dictionary objects with the JSON data (key-value pairs).
I combine them into a single one by adding the data of the 2nd dictionary to the first:

for lines in pricedata:
   data[lines] = pricedata[lines]
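As an aside, the merge loop is equivalent to Python's built-in dict.update (the sample values below are made up):

```python
# The merge loop, done with dict.update instead (values are made up):
data = {"total_hashrate": 30}
pricedata = {"EUR": 383.0}
data.update(pricedata)  # copies every key/value of pricedata into data
print(data)
```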

The nice thing about these Python dictionaries is that they allow you to dynamically edit the JSON before submitting it again.  I could remove the data I don't want, for instance.
In this case, I need to do something about the boolean values returned by the Dwarfpool API, because Domino Data Access Services does not like them!

for lines in data:
   print lines,data[lines]        
   if data[lines] ==  True:
           data[lines] = "True"
   if data[lines] == False:
           data[lines] = "False"

The next step is to post the JSON document to Domino.
It's very straightforward: the URL used will create a new Notes document, based on the Form named "Data".  ( )

The Domino Form needs to exist of course, but it's not very important that the fields are on there.

url = ''

There are some headers to set; in particular, "Content-Type" must be set to "application/json".

To authenticate, I use a Basic Authentication header.  In this case, the user I authenticate with only has Depositor access to the database (which is the first time in 20 years of Domino experience that I see the point of having this role in an ACL :-) ).

The service responds with HTTP code 201 if everything went correctly.  This is of course something you can work with (if the response code does not match 201, do something to notify the administrator, for instance).

The full script:

# retrieves dwarfpool data for my wallet
# retrieves current price ETH
# merges the 2 in a flattened JSON
# uploads the JSON into a Domino database using the domino rest api
import requests
import json
from flatten_json import flatten

resultseth = requests.get('<wallet>&email=<email address>')
data = resultseth.json()
print "-----------------"

# retrieve eth price
currentprice = requests.get(',USD,EUR')
pricedata = currentprice.json()

print "------------------"
data = flatten(data)

# merge json data
for lines in pricedata:
   data[lines] = pricedata[lines]

for lines in data:
   print lines,data[lines]        
   if data[lines] ==  True:
           data[lines] = "True"
   if data[lines] == False:
           data[lines] = "False"

url = ''
myheaders = {'Content-Type': 'application/json'}
authentication = ("<Depositor userid>", "<password>")
response =, data=json.dumps(data), headers=myheaders, auth=authentication)
print response.status_code

Lessons learned

  • The Domino DAS are fast and easy to use from Python.
  • The Domino Data Access Services POST requests do not handle nested JSON, so you need to first massage your JSON into a flat format.
  • The Domino DAS is pretty picky about types: it does not support Boolean values (true/false).
  • Finally, I have seen a good use of the Depositor role in action!

Chart.js and Domino

Now the data is in Domino, and we can start thinking about displaying it.

The Single Page Application

I created a Page in Domino, and put all HTML and Javascript on that page as pass-thru HTML.

Having the code in Domino has the advantage that the Domino security model is used, so I need to authenticate first to be able to use the SPA.
The same code could live anywhere else (e.g. as an HTML page on any webserver), but then I'd have to worry about authenticating the Ajax calls that retrieve the data.
I set the Page to be the "Homepage" of the database.

I use two JavaScript libraries: jQuery and Chart.js.

For Chart.js, there are several ways to include the code; I chose to use a Content Delivery Network ( )

<script src="" integrity="sha256-vyehT44mCOPZg7SbqfOZ0HNYXjPKgBCaqxBkW3lh6bg=" crossorigin="anonymous"></script>

For jQuery, I learned that the "slim" version does not have the JSON libraries, so use the minified or full version.


Chart.js is a simple charting engine that is easy to use and apparently also very commonly used.
I did have problems getting it to work correctly with my Domino data, but that turned out to be related to Domino, not to Chart.js.

The samples that are out there for Chart.js generally do not include dynamic data, so here's how to use dynamic data in Chart.js with Domino.


What worked best for me is to initialize the Chart in the $(document).ready function.  Without jQuery, you can do the same with window.onload.

The chart is stored in a global variable, myChart, so it is accessible from everywhere.

The trick here is to initialize the Chart's data and labels as empty arrays.  The arrays will be loaded with data in the next step (the title is also dynamic, you may notice).

In this sample, I have 2 datasets, and only at the end of this function do I call the first load of the data (updateChartData).

<script language="JavaScript" type="text/javascript">
var pageNumber = 0;
var pageSize = 24;
var myChart = {};
// prepare chart with an empty array for data within the datasets
// 2 datasets, 1 for EUR, 1 for ETH
$(document).ready(function() {
    // remove data button needs to be disabled when we start
    document.getElementById('removeData').disabled = true;
    var ctx = document.getElementById("canvas").getContext("2d");
    myChart = new Chart(ctx, {
        type: 'line',
        data: {
            labels: [],
            datasets: [{
                label: "EURO",
                data: [],
                borderColor: '#ff6384',
                yAxisID: "y-axis-eur"
            }, {
                label: "ETH",
                data: [],
                borderColor: '#36a2eb',
                yAxisID: "y-axis-eth"
            }]
        },
        options: {
            responsive: true,
            animation: {
                easing: 'easeInOutCubic',
                duration: 200
            },
            tooltips: {
                mode: 'index',
                intersect: false
            },
            hover: {
                mode: 'nearest',
                intersect: true
            },
            scales: {
                xAxes: [{
                    display: true,
                    scaleLabel: {
                        display: true,
                        labelString: 'History'
                    }
                }],
                yAxes: [{
                    type: "linear",
                    display: true,
                    position: "left",
                    id: "y-axis-eth",
                    gridLines: {
                        drawOnChartArea: false // only want the grid lines for one axis to show up
                    }
                }, {
                    type: "linear",
                    display: true,
                    position: "right",
                    id: "y-axis-eur"
                }]
            }
        }
    });
    // first load of the data
    updateChartData(pageSize, pageNumber);
});
Load data

The getJSON call (jQuery) connects to the Domino view and passes 3 parameters:
- ps (page size): set to 24 to retrieve the last 24 documents (a document is generated every hour by the Python cron job)
- page (page number): sets the paging; initially set to 0.
- systemcolumns=0: avoids Domino-specific data being returned (data that we'll not use anyway in this scenario)

The JSON that is retrieved from the Domino view is now loaded into an array of objects that we can loop through.

The Chart data is directly accessible:
Labels: myChart.data.labels
Dataset 1: myChart.data.datasets[0].data
Dataset 2: myChart.data.datasets[1].data

The last call, myChart.update(), redraws the chart with the new data.

var updateChartData = function(ps,pn) {
    myChart.options.title = {
        display: true,
        text: 'Last 24 hour performance - ' + $, "d MMM yyyy HH:mm")
    };
    $.getJSON("/dev/dataservices.nsf/api/data/collections/name/GraphData?systemcolumns=0&ps="+ps+"&page="+pn, function(data){
        console.log(" Loading page " + pn + " with pagesize " + ps + " returned " + data.length + " entries");
        for (var i=0; i < data.length; i++) {
            //console.log( " index: " + i + "  EUR : " + data[i].TOTAL_VALUE_IN_EUR );
            //shift to delete first element in arrays, not necessary in this case

This is the end result :
Image:Trying out Domino data services with Chart.js


To code the buttons, I used an EventListener (copied from the Chart.js samples: ).
However, they did not work as expected initially.

On every click, the whole page reloaded; this is not what you want in a Single Page Application!

To counter that, I added the "e" in the function to pass the event, and then use preventDefault to avoid reloading the page.

$( "#addData" ).click(function(e) {
    // --------- prevent page from reloading ------
    e.preventDefault();
    // ----
    console.log( " Retrieving page : " + pageNumber );
    updateChartData(pageSize, pageNumber);
    document.getElementById('removeData').disabled = false;
});

Without jQuery, it would look like this (it needs some additional code for cross-browser compatibility).
The first line is there for cross-browser compatibility (Firefox does not know window.event, which is actually an ugly IE hack).

document.getElementById('addData').addEventListener('click', function(e) {
    if(!e){ e = window.event; }
    e.preventDefault();
    console.log( " Retrieving page : " + pageNumber );
    updateChartData(pageSize, pageNumber);
    document.getElementById('removeData').disabled = false;
});

Only after I made that change did I realize that this behaviour was in fact caused by Domino, and that disabling the Database property "Use JavaScript when generating pages" would fix it.
Why our Domino developers ever thought it was a good idea to put HTML forms in Pages, I will never understand (I understand why they used this in Forms).

And in my testing, I still needed the preventDefault, even with the Database property set ...

Some after-the-fact googling suggests that using preventDefault is in fact the way to go (e.g. ).

Lessons learned

  • Using a Domino Page to host the Javascript code enables the Domino security model.
  • I forgot about the Domino quirks with regard to web applications (e.preventDefault).
  • $.getJSON can be set up using $.ajaxSetup, although it's not necessary.
  • I didn't find good Chart.js samples for dynamic loading of data.

Since we're talking Ethereum, you may of course donate here :-)  0x5ac81ec3457a71dda2af0e15688d04da9a98df3c

    Check limits on open files for running processes

    Tom Bosmans  10 November 2017 17:02:41
    OK, setting the correct limits in /etc/security/limits.conf and messing around with ulimit can leave you thinking everything is OK, while it is not.
    This little line shows you an overview of all the running Java processes, to quickly check that the Open File limit is correct.

    check the limits (open files) for all running java processes
    (as root)

    for i in $(pgrep java); do prlimit -p $i|grep NOFILE; done

    In this example, you see that just 2 of the JVMs are running with the correct limits.  The easiest way to resolve this (if /etc/security/limits.conf is correct, and you have a service that starts your nodeagent) is to reboot:

    NOFILE     max number of open files               65536     65536
    NOFILE     max number of open files               65536     65536
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
    NOFILE     max number of open files                1024      4096
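    A variant of the same check that needs no extra tools reads the limits straight from /proc, and prints the PID next to each line so you can spot the offending process:

```shell
# Same check as the prlimit loop above, but reading /proc directly
# (always available on Linux) and printing the PID next to each line.
for pid in $(pgrep java); do
    printf '%s: ' "$pid"
    grep 'Max open files' "/proc/$pid/limits"
done
```

    The "Max open files" line in /proc/<pid>/limits shows the same soft and hard values as the NOFILE line from prlimit.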

    DKIM deployed on my mail servers

    Tom Bosmans  16 June 2017 10:40:42
    After moving my server to a new physical box (and new IP Address), some of the more difficult large mail systems started rejecting mail from my domains.
    Google was OK with my mails, although not ecstatic, but Yahoo and especially Microsoft considered my systems dangerous apparently.

    I googled around, found a lot of crap information, but resolved the issue and improved my mail setup in the end.  It turned out that I should be using TLS (for secure SMTP) and DKIM (DomainKeys Identified Mail).

    The bad stuff

    - There are a lot of links advising you to use Return Path.
    Don't invest time here.  It's a service for spammers, I would say (they call it "email marketing").  You need to register and will likely never get a response anyway.
    - Domino does not support DKIM natively, and likely never will.
    - Microsoft (with all their mail domains) is very tricky.
    - Yahoo is difficult as well, but should you care?  You shouldn't be using Yahoo mail anyway these days.
    - MailScanner breaks DKIM, so it requires changes in the configuration (the problem being that it modifies the message after the DKIM signature has been applied).
    It's a little tricky to find out all the details, because most test tools report that "dkim is working", while Google complains ....
    - Postfix works with Letsencrypt certificates, but again, the information on the internet is sometimes incorrect or incomplete at best.
    - DKIM relies on DNS configuration, and that can be tricky (depending on your DNS provider or your DNS server)

    The good information

    - Postfix supports DKIM through the opendkim milter add-on.
    - Testing DKIM can be done using an online test tool: very handy, fast, easy, no registration.
    - The proof is in the pudding: sending mail to Gmail (Google) actually shows the information nice and tidy.
    - Letsencrypt and Postfix work together nicely once the setup is done correctly.

    Let's get to work

    So what I had to do, in a nutshell :

    • Change my Domino configuration, so outgoing mail is also sent through Postfix.  This is as simple as setting the "Relay host for messages leaving the local internet domain".
      This is necessary to allow opendkim to sign the outgoing mails as well.
      Relay host for messages leaving the local internet domain:

    • Configure Postfix - add the milter for dkim (and configure TLS with Letsencrypt) in the Postfix configuration.
    • Configure MailScanner - apply the DKIM-related settings in its configuration file.
    • Configure opendkim (generate the keys).
    • Configure DNS (create a new TXT record for the key you created.  In general, you can use "default", and you then require a record for default._domainkey.<your domain>).
    • Verify your key using opendkim-testkey.
    • Test the DNS entry, e.g. using host: host -t txt default._domainkey.<your domain>
    • Test the mails you send out using an online DKIM test tool, or use Gmail to check.
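    To make the Postfix step a bit more concrete, this is roughly what ends up in the Postfix main configuration (the milter socket address, the domain and the certificate paths are assumptions here; match them to your own opendkim and Letsencrypt setup):

```
# Postfix main configuration -- hook in the opendkim milter and the
# Letsencrypt certificates (socket address and paths are examples)
smtpd_milters = inet:localhost:8891
non_smtpd_milters = inet:localhost:8891
milter_default_action = accept

smtpd_tls_cert_file = /etc/letsencrypt/live/example.com/fullchain.pem
smtpd_tls_key_file  = /etc/letsencrypt/live/example.com/privkey.pem
smtpd_tls_security_level = may
```

    The matching DNS entry is then a TXT record at default._domainkey.<your domain>, containing the v=DKIM1; k=rsa; p=... value that opendkim-genkey writes to default.txt.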

    Use Gmail to check your settings

    Gmail actually has the possibility, by default, to verify various settings.
    Next to the "to me", click the dropdown button.
    If you have set up DKIM correctly, it will show a "signed-by" line.  You can see TLS information here as well.
    Image:DKIM deployed on my mail servers
    Additionally, you can also go to "Show original"
    Image:DKIM deployed on my mail servers
    This will show the source of the email, and has a summary header that contains important information.
    As you can see, it shows that DKIM has a PASS.  If it says something else here, you need to go back to the drawing board.
    Image:DKIM deployed on my mail servers

    This can contain a lot more options, btw.  If you use DMARC as well, it will show up here too.  For my domain, you see the SPF option.

    Microsoft's domains

    Once you're certain DNS is set up correctly and your server is not an open relay, you can easily contact Microsoft directly to unblock your mail server(s) here:
    This immediately works for all the Microsoft domains.

    This took only a few hours in my case.

    Server outage (disk failure)

    Tom Bosmans  6 June 2017 10:08:04
    Yesterday morning, I noticed that my server was running slow.  I couldn't see any processes hogging resources, though.

    Instead of really looking into the problem, I decided to reboot the machine.  That was a mistake.  As the server did not come back online, I realised that there was likely a problem with the disks.
    I have a dedicated server at Hetzner, and it's really the first time I've run into problems.  I can really recommend this hosting provider.

    The server has a software RAID with 2 disks, running CentOS.
    I assumed that mdadm was trying to recover, but had no way of knowing, since the machine did not come back online.
    At this point, I got very scared - I feared loss of data.

    Fortunately, the guys at Hetzner supply a self-service console to the machine (you start a rescue system).

    I could log in using that mechanism, and then I was able to mount the RAID filesystems.  It was quickly clear that indeed, 1 disk had died.

    Now I could do 2 things :
    - request a disk replacement.  This was going to take a while, and during that time I wouldn't have a redundant disk.  And chances are high that when 1 disk fails, the other will also fail soon.
    - move my installation to a new server.  I know that going from ordering a new server to having the OS installed and ready for use only takes around 1 hour (did I mention these guys are great?  Note that this is physical hardware, not some cloud service!)

    I decided to go with option 2.

    This consists of copying the data from the old server to the new one (this took a long time), reinstalling the software, reapplying the configuration for my mail servers and other stuff, and then adjusting the Domino configuration (changing the IP addresses).

    In the end, it took me 10 hours in all to get the new server up and running... including copying the data.  Now I just have to decommission the old server, and I'm done :-)

    Kubernetes and dns

    Tom Bosmans  28 April 2017 11:00:25
    Kubernetes apparently doesn't use a host file, but instead relies on DNS.  So when setting up Orient Me (for Connections 6) on a test environment, you may run into problems.

    Then you may want to look back to this older blog entry :
    Setup DNS Masq

    You're welcome :-)
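    The core of that dnsmasq trick is a single wildcard line in dnsmasq.conf, which resolves every name under your test domain to one box (the domain and IP address below are examples, not from the original post):

```
# /etc/dnsmasq.conf -- resolve all names under the test domain to the
# Connections/Orient Me host (example values)
address=/cnx.example.com/
```

    Point your Kubernetes nodes at the dnsmasq server, and the names resolve without touching any host files.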

    To keep with the Docker mechanism, look at this to make your life easier:

    Note that this is obviously not the only solution; you can also follow these instructions: