Chat Guidelines


These are my personal guidelines for using chat; I’m putting them in writing to avoid confusion. Some things, like ping and AFK, might look strange to those who missed the IRC age.

It’s mostly written for one-on-one chats, but most of it works for group chats too if you use common sense. Available at any large retailer near you.

To me a chat session is a permanent line between us: there are no goodbyes, just long AFK sessions like goodnight. It’s perfectly fine to suddenly stop in a conversation. The session is still there, and if the other party really wants a reply they can just send a question mark.

Working hours

During working hours my phone is on silent to keep my focus, and I usually check for messages once every half hour, so expect a delay. If I’m running pomodoros you will get a “/afk – pomodoro” from me, meaning I’m away for 25 minutes.

The ping

Checking whether someone is there for a friendly chat? Just ping.

Me: ping
You: pong
Me: Hey I just figured out a large problem...

The magic .

It’s a lighter version of ping, mostly used to get a conversation going again when either party is distracted.

Me: <is typing for 15 minutes>
You: .
Me: Oh, sorry got distracted, so as I was saying....

The puzzled look or ?

When I send just a question mark there are two options: either you just wrote something I don’t understand and I would like you to rephrase it or go into detail, or I asked something, didn’t get a reply yet, and would really like to know.

Me: I'm sapiosexual
You: ?
Me: To be aroused by intelligence

You: Do you have time this friday?
<time passes>
You: ?
Me: I'm a bit busy, I believe so, will get back to it within the hour

Dealing with real life or AFK

The real world doesn’t care about our chat session, so if you suddenly need to pay attention to it, just drop an AFK in the chat. Optionally you can add a reason.

Me: /AFK - Meeting


Fixing “Can’t add an inotify watch of /etc/config/rsync_schedule.conf error!: No space left on device” on a QNAP-419P


So, this issue keeps coming back on my QNAP when using rsync for backups. It seems to be a recurring bug, but the root cause is that the system doesn’t allow enough inotify watches, resulting in this error.

The best solution would be a proper fix so fewer watches are needed, but since we also want to get stuff done, we’ll cheat our way out by raising the maximum.

First check the value on your current box; log in as admin over SSH:

# cat /proc/sys/fs/inotify/max_user_watches

Nice, but a bit low. You can quickly raise it by updating the file:

# echo "655360" > /proc/sys/fs/inotify/max_user_watches
# cat /proc/sys/fs/inotify/max_user_watches
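If your firmware ships a sysctl binary (an assumption — not every QNAP firmware includes it), the same temporary change can be made with it:

```shell
# Equivalent to echoing into /proc: takes effect immediately, lost on reboot
sysctl -w fs.inotify.max_user_watches=655360
```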

So that’s great, but on reboot it will reset. To make it permanent we will add it to a startup script on the config partition. This example is for the P419+, but you might want to check the wiki for other versions.

# mkdir /tmp/config
# mount -t ext2 /dev/mtdblock5 /tmp/config
# vim /tmp/config/
## This is where you add the above line in a small shellscript
# chmod 755 /tmp/config/
# umount /tmp/config
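As a sketch, the small shell script mentioned in the comment above could look like this (the filename and exact startup mechanism vary per model, so verify on the QNAP wiki which script name your firmware runs at boot):

```shell
#!/bin/sh
# Raise the inotify watch limit on every boot
echo "655360" > /proc/sys/fs/inotify/max_user_watches
```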

Now reboot and check whether the value remains by repeating the first step.


Generate true unique server-id for mysql using puppet


I found a few solutions on the net, but people just append the last two digits, and I would like to avoid collisions where two different IP addresses both produce server-id 111.

So to fix that, I separate the octets and multiply the third octet by 1000.

$serverid = inline_template('<%= scope.lookupvar(\'::ipaddress\').split(\'.\')[2].to_i * 1000 + scope.lookupvar(\'::ipaddress\').split(\'.\')[3].to_i %>')
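The arithmetic itself is easy to sanity-check outside Puppet. Here is the same computation as a small Python sketch (the function name is mine), using two addresses that a naive “last two digits” scheme would collide on:

```python
def server_id(ip):
    """Third octet * 1000 plus fourth octet, mirroring the inline_template."""
    octets = ip.split(".")
    return int(octets[2]) * 1000 + int(octets[3])

# x.x.1.11 and x.x.11.1 would both become "111" when simply concatenated
print(server_id("10.0.1.11"))   # 1011
print(server_id("10.0.11.1"))   # 11001
```

Since the fourth octet is always below 1000, every (third, fourth) pair maps to a unique id.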

Getting Portify to run on Ubuntu (13.04)


So, the wife wants to transfer from Spotify to Google Music. As luck would have it, there is a tool to do that.

A quick disclaimer:

  • These are the steps I took to get it to work on my Ubuntu 13.04 install
  • Portify isn’t fully stable; it stopped/hung a couple of times

Preparing Ubuntu

I’m building as my regular user; feel free to break/compromise your system by running this as root. I’ll start with getting a current version of Node.js and the build tools:

sudo apt-add-repository ppa:chris-lea/node.js
sudo apt-get update
sudo apt-get install -y nodejs build-essential

Fetching Source code

Once you have the basics set up, it’s time to create a workdir and get the code.

# I work from a Projects folder, feel free to build in any location you like
cd ~/Projects
git clone
cd portify

Building the code

Now, the Readme tells me that npm install fetches all dependencies; this is a lie, at least for me. 🙂

cd ./data
npm install debug formidable methods
npm install superagent connect
npm install buffer-crc32 fresh range-parser
npm install module cookie-signature cookie send
npm install

This should be enough to get it to build.

Running and accessing the program

Portify is a daemon that runs as a web application, so to get things going you start it up:

cd ./data
node app.js

Now the only thing left to do is access it through your browser once it’s running.


The tool helped me port most playlists from Spotify to Google Music, but it’s not always successful. And installing is a bit tricky; hope this helps someone.




I love playing with autokey-gtk, as it saves me a lot of keystrokes and helps improve my workflow. It’s also the main reason I will not be switching to another OS like ChromeOS.

I’ll try and post a few of my scripts once in a while; they are Python code, so the sky’s the limit.

Copy text, filter out the Jira ticket and open it in a new tab

This script selects all text, copies it to the clipboard, and then searches it for a Jira ticket number. Once found, it opens a new tab and pastes the Jira URL.

(Siteops tickets start with SO-, adjust as needed)

import time
import re

# Select all text and copy it to the clipboard
keyboard.send_keys("<ctrl>+a")
keyboard.send_keys("<ctrl>+c")
time.sleep(0.2)
text = clipboard.get_clipboard()
m ="(SO-[0-9]+)", text)
if m:
    keyboard.send_keys("<ctrl>+t")  # open a new browser tab
    keyboard.send_keys("https://jira/browse/%s\n" %

Gmail to any mail-based task manager

This one is simple: copy the URL from the browser and forward the mail to my task manager with the Gmail URL pasted into it. That way I can always get back to the original mail when working on the task.

# Script to copy URL and forward mail with URL included
import time

Building a Hackintosh!


So, I hate Apple with a passion. They make great hardware, but you are limited in choice, and if something breaks you can bet your sorry ass that your system will be in repair for a couple of weeks. I run Ubuntu, thank you very much.

But my wife, Maaike Appels, has to work on a Mac: Lightroom/Photoshop aren’t available for Linux and are a requirement for a professional. So, for her, I’ll put the hatred to rest and find a solution.

She has a Mac Mini, but clearly the thing just isn’t coping with the usage. And those pictures are not getting any smaller, I can tell you. The joys of tech. What she needs is a Mac Pro, but do you know what those things cost? 3000 euro, and then you get a whopping 8GB of RAM. Nice, if you only need to edit your holiday pictures.

So, after much internal debate, the only way left to get a truly high-end system without selling the car is to build my own.


I already have a case: an old Dell desktop that I’ll happily gut, as the mainboard never did play nice with Ubuntu installs. So I’ll replace what’s bad (mainboard/memory/CPU) and keep what’s OK (case/power/burner).

Shopping List based on the Tony CustoMac Page.

  • Gigabyte GA-Z68MA-UD2H-B3 – mATX Mainboard – € 131,45
  • Intel Core i7 2700K/3.5Ghz – CPU – € 271,00
  • Sapphire Radeon HD 6850 – Videocard – € 121,00
  • Corsair XM3 Quad Channel 4x 8GB – 32GB Ram Total – € 217,00
  • OCZ SSD Agility 3 Series – 180GB SSD – € 159,95

Total cost: € 900,40 – should run circles around a Mac Pro, and run OSX just fine.


Get a copy of Lion from the App Store, one of the benefits of having a Mac handy. As I never used the App Store on the Mac Mini, it had to be configured first. After typing in my password for the 10th time and searching for Lion about 8 times, it allowed me to buy the software and start the download.

After this I followed the Unibeast instructions to create the USB stick, which worked perfectly without any issues and made this install a lot easier.

Building the system

So, if you have ever built a computer system, this isn’t going to be hard for you. Simply clean the case, insert the hardware and make sure it’s hooked up properly. I really didn’t need to look anything up for this. But if you have problems with it, I recommend finding a friend who has done this before.

You might notice the SSD is not attached here; I was still waiting for a bracket for it, and expecting an extension cable for the case fan the next day. But apart from that, everything is here, and nothing is stopping me from at least doing the install.

Installing the OS

Nothing I can tell you here that you cannot already find in the Unibeast install guide; just take your time and don’t be afraid to make mistakes.

After installing the OS you need to run Multibeast to get the drivers configured; make sure you read up beforehand on how this works. I made a few critical mistakes and that made my system very unstable. The quickest fix if this happens is to just reinstall. Don’t forget to wipe the disk before doing so.

To use Multibeast this guide is essential: read it, then read it again, and actually understand what does what.

Make sure you understand what UserDSDT files are, and where to put them (hint: on your desktop, with a special name). They will make sure your system runs fine.

Once this is done, the system should boot up fine and most things should be working.

Stuff I had to fix

Dual Screen with the ATI HD 6850 Card

So this required a little digging to solve, but it seems to be all related to the framebuffer. In short, I have to tell the bootloader GraphicsEnabler=no; the simplest way to do this is to just type it in during boot. (Really, just start typing at the boot screen.)

Once you are sure that it works fine, you can update the boot config. A nice guide for picking the right framebuffer is this one.

Boot no longer working

It seems I have broken my bootloader somewhere along the way. I’m still debugging this, but for now booting from the USB stick gets things going. I think I made a typo in a configuration file somewhere.

Overall Impression so far

Once it’s up and running you can get mighty impressed; I mean, this beast is flying all over. Of course, you do notice a few small cracks: there seem to be some stability issues when putting the system in sleep mode. And as Maaike discovered, Adobe isn’t too keen to help you out when you have issues.

Timemachine to the rescue

So, I totally broke everything last week. Not my best moment, and one that only got solved because I already had a Time Machine backup on the QNAP.

Select the Time Machine vault, pick the last backup, restore…

Wait for an hour and presto, back to pre-break tweaking.


  • Pick your hardware carefully
  • Take your time and read
  • Don’t be afraid to re-install
  • Make sure you have an external drive or QNAP to keep a Time Machine backup handy.
  • When updating the boot configuration, make backups and don’t try to change too much at once.


Geekbench score: 11435

Not as good as I would have expected, but then again it was built to have tons of RAM, not CPU power. If you care about your system feeling snappy, get the SSD and the extra RAM. You might not win speed tests, but everyday use will make you smile.


Super Simple Linux Health Monitoring with Uptime Robot and Nagios-Plugins


Uptime Robot is an excellent tool if you want to know whether your site is up, but it won’t tell you much about system health. It’s either down or up and, let’s face it, we’d much rather avoid the down. The professional solution would be to configure Nagios, Zabbix or one of the other tools to monitor the box, but it is overkill to set up a separate box just to monitor your one or two VPS boxes.

So here is a super simple way to monitor system health; it should only take you 10 minutes to set up. I’m assuming Debian/Ubuntu here and some basic Linux skills; if you don’t know how to use chmod/ls/ln and other commands, then really, what are you doing here?

The first thing we are going to do is install the Nagios plugins, just the plugins. If you get hundreds of dependencies you are doing it wrong.

apt-get install nagios-plugins-basic

Once this is done you have a nice toolkit of scripts for quick monitoring. Now create a small shell script that runs all the checks you’d like. I’m running an nginx system with MySQL, so here is my script.

#!/bin/sh
# Location of the Nagios plugin binaries on Debian/Ubuntu
NAGIOS=/usr/lib/nagios/plugins

# Simple Disk Space Check
$NAGIOS/check_disk -w 75 -c 90

# Load Check
$NAGIOS/check_load -w 3,2,1 -c 6,4,2

# MySQL Running?
$NAGIOS/check_procs -w 1:1 -c 1:1 -C mysqld

# NGINX Running?
$NAGIOS/check_procs -w 2:10 -c 2:10 -C nginx

# PHP CGI Running?
$NAGIOS/check_procs -w 16:16 -c 10:22 -C php-cgi

Run the script and check that every line reports OK; if not, either adjust the monitor or fix the issue.

Now add the following to cron (SCRIPT and WEBROOT are placeholders for your own paths):

* * * * * root SCRIPT > WEBROOT/uptime_robot.txt
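With hypothetical paths filled in (both the script location and the webroot below are assumptions for illustration), a system crontab entry could look like:

```shell
# In /etc/crontab the sixth field is the user to run as
* * * * * root /usr/local/bin/health_check.sh > /var/www/uptime_robot.txt
```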

For added security, replace “uptime_robot” with a nice long random string, or, even better, basic authentication; whatever works for you.

For those paying attention: we now have a publicly available text file with the results of our checks in it. Now all we need to do is add it to Uptime Robot.

Pick keyword checking, and configure the URL where you generate your check file. Create two checks, one for the keyword WARNING and one for CRITICAL. Update the settings to suit your needs, for example critical 24×7 and warning only during daytime.

That’s it, told you it would be easy.

Using a reverse proxy to access github from a limited datacenter.


Sometimes life doesn’t work out: you want to push/pull your code to github, but the machine you are working on doesn’t have internet access, and in my case the receiving github server is internal, so even with internet it would not have worked.

You can fix this with a reverse tunnel. Add the following to your local ~/.ssh/config:

Host example-host
  # Forward remote port 12222 on example-host to your git server's SSH port.
  # github.example.com:22 is a placeholder; use your actual (internal) github host.
  RemoteForward 12222 github.example.com:22

And on example-host, add this to its ~/.ssh/config:

Host github.example.com
  # The Host pattern is a placeholder; it should match the hostname your git remotes use
  HostName localhost
  Port 12222

Now, I’m assuming you already have a straight line to the host you are working on; if not, ProxyCommand might solve that. I’m also assuming you have agent forwarding enabled, because unprotected SSH keys are bad, m’kay.

Now, once this is set up, you SSH to the host and should be able to use git without any voodoo on the prompt:

git clone


Lightroom and QNAP


Sometimes the solution is simpler than you would think. My wife is a photographer and consequently has quite a few photos in Lightroom. That is no problem, until your hard disk starts filling up.

My first attempt was to export the old photos to a new catalog on the QNAP, but I quickly found out that although this works, you are then no longer allowed to open the old catalog. The reason: it lives on a network drive.

So that’s pointless; it has to be possible to reach your old photos. First I tried to make her Mac think it was working locally, but that was no solution.

Some more searching turned up the simple answer: use Lightroom’s “Folders”, and then it’s child’s play.

You add a new folder to the catalog and point it at a shared folder on the QNAP. You can then happily drag photos from your local disk to the network drive within Lightroom, and even import directly into this new folder.

In the end, the only requirement turns out to be that the catalog file itself is stored locally; the photos can live anywhere.

Added bonus: she can now search through all her photos, including the ones that were archived years ago.

Can we actually still be positive about Windows?


Just finished my new presentation, which can be found at and soon on the website of the HCC.

I deviated from my normal pattern and this time highlighted the downsides of Windows. Ubuntu is often compared to a well-running Windows, but nobody talks about the road you have to travel to get there.

Windows users don’t know any better: to them it is normal that you have to do maintenance, such as anti-virus, firewalls, cleanup, defragmentation, etc. They also find it normal that a large part of their software asks for attention, for example because an update or some news is available.

I hear them complain sometimes, but it’s all part of the deal. Until they have spent some time on a system without that nonsense, because then they see that it is not normal.

Many people ask me what the advantage of Ubuntu is, and I always found that hard to explain. But nowadays I have an answer: Ubuntu is a mature system that takes care of itself and leaves you alone unless you ask for it. Windows, in that respect, is more like a small child, constantly asking for attention and maintenance. Given the age of that OS, I’m guessing something went wrong in its upbringing.

Should anyone think I’m wrong, I challenge you: I’ll take a standard Ubuntu 9.04 CD and do my work with it for a month. You may pick any version of Windows, but it has to be a standard CD (with service packs, of course). The only thing we transfer is the “Documents and Settings”, and we note down everything that is not part of normal use. At the end we compare who lost the most time. Naturally, I’ll blog the event.

I think you’ll throw in the towel right after installation…

That was it for now…