Michael Zucchi

 B.E. (Comp. Sys. Eng.)

  also known as zed
  & handle of notzed


Sunday, 19 November 2017, 13:02

I hate my life.

And I wish I was dead.

Sunday, 19 November 2017, 02:52

I hate peecees

Well I was up till 3am fucking around with this bloody machine.

After verifying the hardware actually works it seems that the whole problem with my RAM not being found is the damn BIOS. I downloaded a bunch of BIOSs intending to try an older one and realised I hadn't installed the latest anyway. So I dropped that in and lo and behold the memory came back. Yay.

So now I had that, I thought i'd try and get OpenCL working again. Ok, installed the latest (17.40) amdgpu-pro ... and fuck. Unsupported something or other in the kernel module. Sigh. I discovered that 17.30 did apparently work ... but it took a bit of digging to find a download link for it as AMD doesn't link to older releases (at this point i also found out midori has utterly shithouse network code. sigh). I finally found the right one (that wasn't corrupt), installed it and finally ...

Oh, back to 3.5G ram again. FAAARK.

At this point the power went out for the whole street so I had a shower using my phone's torch and went to bed.

Did some more digging when I got up (I was going to give up and say fuck it, but i've come this far), tried manually adding the ram using memmap, and finally confirmed the problem was the BIOS. So i tried an older one. That worked.
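
For reference, the memmap approach amounts to a kernel command line entry along these lines - the address and size here are made up for illustration; the real values have to come from comparing the BIOS e820 map in dmesg against what's actually fitted:

```
# /etc/default/grub - force a region the BIOS map dropped back into use
# (illustrative address only; check `dmesg | grep e820` first)
GRUB_CMDLINE_LINUX="memmap=4G@0x100000000"
```

Then update-grub, reboot, and compare free(1) before and after.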

But only for a while, and then that broke as well. So trying the previous one ... groan.

Maybe it's time to cut my losses; it's already 1pm, the sun is out, heading for a lovely 31 degrees.

I also got rocm installed but i don't know if it works on this hardware, although at least the kernel is running ok so far.

Tagged junk, rants.
Saturday, 18 November 2017, 09:58

io scheduler, jfs, ssd

I didn't have much to do today and came across some articles about jfs and io schedulers and thought i'd run a few tests while polishing off a bottle of red (Noons Twelve Bells 2011). Actually i'd been waiting for a so-called "mate" to show up but he decided he was too busy to even let me know until after the day was over. Not like i've been suicidally depressed this week or anything.

The test I chose was to compile openjdk 9, which is a pretty i/o intensive and complex build.

My hardware is a Kaveri A10-7850K APU, "4G" of memory (sigh) on an ITX motherboard, and a Samsung SSD 840 EVO 250GB drive. For each test I blew away the build directory, re-ran configure, set the scheduler "echo foo > /sys/block/sda/queue/scheduler", and flushed the buffer caches "echo 3 > /proc/sys/vm/drop_caches" before running "time make images". I forced all cpu cores to a fixed 3.0GHz using the performance governor.
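
The per-run procedure above boils down to a few lines of shell - a sketch assuming root, the jdk source tree as the current directory, and the SSD on sda:

```shell
# one timed build per scheduler; run as root from the jdk9u tree
run_build_test() {
    sched=$1
    rm -rf build                                  # blow away the build dir
    bash configure >/dev/null                     # re-run configure
    echo "$sched" > /sys/block/sda/queue/scheduler
    echo 3 > /proc/sys/vm/drop_caches             # flush buffer caches
    time make images
}
# usage: for s in cfq deadline noop; do run_build_test $s; done
```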

At first I kept getting ICE's in the default compiler so I installed gcc-4.9 ... and got ICE's again. Inconsistently though, and we all know what ICE's in gcc means (it almost always means broken hardware, i.e. RAM). Sigh. It was after a suspend-resume so that might've had something to do with it.

Well I rebooted and fudged around in the BIOS intending to play with the RAM clock but noticed I'd set the voltage too low. So what the hell, I set it to 1.60v and rebooted. And whattyaknow, suddenly I have 8G ram working again (for now), first time in a year. Fucking PCs.

Anyway so I ran the tests with each i/o scheduler on the filesystem and had no ICE's this time (which was disappointing because it means the hardware isn't likely to be busted after all, just temperamental). I used the default compiler. As it was a fresh reboot I ran builds from the same 80x24 xterm and the only other luser applications running beyond the panels and window manager were 1 other root xterm and an emacs to record the results.


cfq (the ubuntu default):
real    10m39.440s
user    28m55.256s
sys     2m40.140s

deadline:
real    10m36.500s
user    28m44.236s
sys     2m40.788s

noop:
real    10m42.683s
user    28m43.960s
sys     2m41.036s

As expected from the articles i'd read, deadline (not the default in ubuntu!) is the best. But it's hardly a big deal. Some articles also suggested using noop for SSD storage but that is "clearly" worse (odd that it increases sys time when it isn't supposed to do anything). I only ran one test each in the order shown so it's only an illustrative result but TBH I'm just not that interested, especially if the results are so close anyway. If there was something more significant I might care.

I guess i'll switch to deadline anyway, why not.
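
Switching it persistently can be done without poking sysfs at every boot; a udev rule is one way - a sketch, assuming the old (non blk-mq) block layer where "deadline" is a valid scheduler name:

```
# /etc/udev/rules.d/60-iosched.rules - pick deadline for SSDs (rotational==0)
ACTION=="add|change", KERNEL=="sd[a-z]", ATTR{queue/rotational}=="0", ATTR{queue/scheduler}="deadline"
```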

No doubt much of the compilation ran from buffer cache - in fact this is over 2x faster than when I compiled it on Slackware64 a few days ago - at the time I only had 4G system ram (less the framebuffer) and a few fat apps running. But I also had different bios settings (slower ram speed) and btrfs rather than jfs.

As an aside I remember in the early days of Evolution (2000-2-x) when it took 45 minutes for a complete build on a desktop tower, and the code wasn't even very big at the time. That dell piece of crap had woefully shitful i/o. Header files really kill C compilation performance though.

Still openjdk is pretty big, and i'm using a pretty woeful CPU here as well. Steamroller, lol. Where on earth are the ryzen apu's amd??

notzed@minized:~/hg/jdk9u$ find . -type f \
  -a \( -name '*.java' -o -name '*.cpp' -o -name '*.c' \) \
   | xargs grep \; | wc -l

(some of the c and c++ is platform specific and not built but it's a good enough measure).

Tagged junk.
Friday, 17 November 2017, 22:57

Midori user style

Well I found a way to make Midori usable for me as a browser-of-text using the user stylesheet thing.

Took some theme and stripped out the crap and came up with this ... it turns all the text readable but leaves most of the rest intact, which is an improvement on how firefox rendered its colour and style overrides. Firefox just overrode everything, which broke a lot of style-sheet driven GUI toolkits amongst other sins.

* {
    color: #000 !important;
    text-shadow: 0 0 0px #000 !important;
    box-shadow: none !important;
    background-color: #777 !important;
    border-color: #000 !important;
    border-top-color: #000 !important;
    border-bottom-color: #000 !important;
    border-left-color: #000 !important;
    border-right-color: #000 !important;
}

div, body {
    background: transparent !important;
}

a, a * {
    color: #002255 !important;
    text-decoration: none !important;
}

a:hover, a:hover *, a:visited:hover, a:visited:hover *, span[onclick]:hover, div[onclick]:hover, [role="link"]:hover, [role="link"]:hover *, [role="button"]:hover *, [role="menuitem"]:hover, [role="menuitem"]:hover *, .link:hover, .link:hover * {
    color: #005522 !important;
    text-decoration: none !important;
}

a:visited, a:visited * {
    color: #550022 !important;
}
I don't know if i'll stick with it yet but it's a contender; its built-in javascript blocker looks a lot better than some shitty plugin which will break every ``upgrade'' too. It doesn't seem to run javascript terribly fast, but that's the language's fault for being so shithouse.

Friday, 17 November 2017, 11:32

Silent Spring

Well I finally got tired of Slackware64 - it worked quite well until I upgraded it to 14.2 some weeks ago, and then too much stuff broke - all I could get working was the VESA driver which meant I was back to a single screen, and no opencl either. I ended up putting in xubuntu lts (16.04).

The machine is kinda busted anyway; one of the DIMM slots doesn't work - the BIOS detects it but Linux throws the memory away for some reason. This happened months ago but I thought i'd give it another go, and of course made no progress. Before it vanished it would sometimes show up and then crash if I used too much memory, so I think the motherboard slot is shot (either DIMM works ok in slot 0, neither in slot 1). I'm in no rush to replace it, although 4G is a bit tight at times but it's not like I do much on it other than waste my life browsing shit I don't care about, with the occasional bit of pointless coding thrown in.

I probably wouldn't have bothered with ubuntu (and i really don't like it or other debians as a whole) but I somehow corrupted the BTRFS filesystem (I think by modifying it from a live-boot ubuntu which necessarily has a different build - it's been solid for years) and completely lost it. And of course I was modifying it to throw away junk before I backed it up, so I hadn't backed it up. I think that's my last experiment with BTRFS, i'm back to jfs now.

At least ubuntu installed quickly and easily and all, not too much hassle apart from the boot config. I set up an EFI boot partition which ubuntu automatically placed in /boot/efi, but I didn't create a separate /boot partition, which is where grub expects to find everything - like the jfs module it needs to read the modules directory on the root partition. Sigh. Well after multiple attempts (easily enough booting via the live usb stick) I eventually worked out the grub command lines to install everything in the /boot/efi partition and have it work (I haven't used grub in years, and that was pre-grub 2).
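
The commands that achieve that look something like the following - a sketch only, not necessarily what I typed; the bootloader id and paths are assumptions:

```
# install grub, its modules and its config entirely into the EFI partition
grub-install --target=x86_64-efi \
    --efi-directory=/boot/efi \
    --boot-directory=/boot/efi \
    --bootloader-id=ubuntu
grub-mkconfig -o /boot/efi/grub/grub.cfg
```

With --boot-directory pointed at the EFI partition the modules directory lives on FAT, so grub no longer needs the jfs module just to get started.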

Then I had to go through the ritual of removing all the junk and installing useful stuff. Fuckoff pulseaudio, networkmanager, gnome-software (I mean what the fuck is this shit?), the auto updater, install emacs and dev packages, jvm's, netbeans. Then had to go through and fix the colours and fonts in the tools I use, fix the WM theme, etc. Fortunately i've installed 2 other machines not too long ago so it wasn't so painful this time.

And the stupid gtk3 scrollbars - something I noticed earlier but thought was just firefox at fault. Apparently scrollbars are hard to get right. I can only imagine that being the case if you also have trouble putting underpants on in the morning.

Firefox 57 is kinda shit. In fact I'm having to use `Midori' to write this because I can't even login to google from firefox because the login window simply doesn't work. And with the new plugin system it means my most useful plugin - one that toggles the javascript enable flag - is broken. You can sort of get the same functionality from the new plugins but because of the way it works any site still thinks javascript is available so it might present a different page than if it wasn't; i.e. it just breaks more shit. I dunno, i'll see on that one, i'm reading less and less lately anyway so it might not matter. The whole browser is pretty bloody fugly too, like someone published an early mockup made in a spreadsheet tool.

And apparently firefox dropped alsa support some time ago, I thought the upgrade to Slackware64 14.2 had just broken something else. Actually I kind of like that computers just shut the fuck up (and don't catch you out by going beep) unless they're asked to make noise so it's no big deal. youtube is a bit crap, and particularly shit in a browser, and there isn't really any other valid reason for sound out of a browser.

The only reason I stick with firefox is because you can override the default colours and fonts completely (black on white gives me a fucking headache, as do tiny or shitty typefaces - black on grey, DejaVu @ 9+ pix everywhere please), and disable javascript (it just makes so much of the web so much cleaner and faster). It also used to work well enough on the few sites I regularly use.

Sigh, i've had a troubled week.

Tagged linux, rants.
Sunday, 05 November 2017, 04:21

JNI and garbage collection

I've started on an article about creating garbage collectible JNI objects. This is based on the system used in zcl but simplified further for reuse by using the class object as the type specifier and binding release via static declared methods.

This also supports `safe' explicit release which may be required in some circumstances where the gc is not run often enough.

It should work well with the JVM as it uses reference queues and no finalize methods. It requires minimal "extra" application support - just a class specific release() method.
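
The core of the idea can be sketched like so - a rough, hypothetical rendition only, not the article's or zcl's actual API (CObject, CRef and free() are made-up names), with the gc path and the `safe' explicit path funnelling through the same idempotent release:

```java
import java.lang.ref.PhantomReference;
import java.lang.ref.Reference;
import java.lang.ref.ReferenceQueue;
import java.util.Set;
import java.util.concurrent.ConcurrentHashMap;

// Native handles are tracked by PhantomReference: freed either when
// the gc enqueues the reference, or via explicit release(). No finalize().
class CObject {
    static final ReferenceQueue<CObject> queue = new ReferenceQueue<>();
    // hold the references strongly so they survive until released
    static final Set<CRef> live = ConcurrentHashMap.newKeySet();

    final CRef ref;

    CObject(long p) {
        ref = new CRef(this, p);
        live.add(ref);
    }

    // `safe' explicit release for when the gc isn't run often enough
    void release() {
        ref.release();
    }

    // drain the queue; a real binding runs this from a daemon thread
    static void poll() {
        Reference<? extends CObject> r;
        while ((r = queue.poll()) != null)
            ((CRef) r).release();
    }

    static class CRef extends PhantomReference<CObject> {
        final long p;               // the native pointer
        private boolean released;

        CRef(CObject o, long p) {
            super(o, queue);
            this.p = p;
        }

        // idempotent: safe to call from both the gc path and user code
        synchronized void release() {
            if (!released) {
                released = true;
                live.remove(this);
                free(p);            // the class-specific native release
            }
        }
    }

    // stand-in for the per-class static native release method
    static void free(long p) {
        System.out.println("free " + p);
    }

    public static void main(String[] args) {
        CObject o = new CObject(42);
        o.release();          // explicit path: prints "free 42"
        o.release();          // second call is a no-op
        CObject.poll();       // gc path: nothing queued here
    }
}
```

A real binding would block on the queue from a daemon thread rather than polling, and would look up the class-specific release via the class object as described above.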

Read it here.

Tagged code, hacking, java.
Sunday, 05 November 2017, 02:52

java 9

Yesterday I had a quick look at java 9 - i hadn't installed it earlier as I was waiting for GA and I didn't really have a need. After a long silent spell I don't get many hits these days to this site so I don't know if anyone will read this but whatevers.

I guess the main new thing is the module system. It probably has some warts but overall it looks quite decent. maven and `aficionados' of other modularisation systems seem to be upset about some things with it but I mean, maven?

I did a bit of playing with zcl to see how it could be modularised. At least on paper it's a very good fit for this project due to the native code used. In practice it seems a little clumsy, at least at my first attempt.

I decided to separate it into two parts - the main reusable library and the tools package. This required adding another top-level directory for each module (as the module name is used by the compiler), and a couple of simple module-info.java files.


module au.notzed.zcl { // module name
    exports au.notzed.zcl; // package name

    requires java.logging; // module name
}

module au.notzed.zcl.tools { // module name
    exports au.notzed.zcl.tools; // package name

    requires au.notzed.zcl; // module name
}

With the source moved from src/au to au.notzed.zcl/au or au.notzed.zcl.tools/au as appropriate (sigh, yuck). Note that the name is enforced and must match the module name although the rest of the structure and where the module name exists in the path is quite flexible; here i obviously chose the simplest/shortest possible because I much prefer it that way.

The filenames in the Makefile were updated, and one tiny added flag is all that's needed to create both java modules at once:

zcl_JAVAC_FLAGS=-h build/include/zcl --module-source-path .

Yes, I also moved to using javac -h to create the jni header files as I noticed javah is now deprecated.
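
Written out as a single command, the build amounts to something like this (the output directory names are assumptions to match the layout above); one invocation compiles both modules and emits the jni headers:

```
javac --module-source-path . \
      --module au.notzed.zcl,au.notzed.zcl.tools \
      -h build/include/zcl \
      -d build/modules
```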

Ok that was easy. Now what?

The next part is to create a jmod file. This can be platform specific and include native libraries (and other resources - although i'm not sure how flexible that is).

The manual commands are fairly simple. After i've had a bit of play with it I will incorporate it into java.make with a new _jmods target mechanism.

build/jmods/zcl.jmod: build/zcl_built
  -rm $@
  mkdir -p build/jmods
  jmod create \
    --class-path build/zcl/au.notzed.zcl \
    --libs jni/bin/gnu-amd64/lib \
    --target-platform linux-amd64 \
    --module-version $(zcl_VERSION) \
    $@

As an aside it's nice to see them finally moving to gnu-style command switches.

Now this is where things kind of get weird and I had a little misreading of the documentation at first (as a further aside I must say the documentation for jdk 9 is not up to its normal standards at all, it doesn't even come with man pages). While a modularised jar can be used like any other at runtime whilst adding the benefits of encapsulation and dependency checking, a jmod really only has one purpose - to create a custom JRE instance. As such I initially went the jmod route for both zcl and zcl.tools and found it was a bit clumsy to use (generating a whole jre for a test app? at least it was "only" 45MB!). The whole idea seems to somewhat fight against the 'write once run anywhere' aspect of java, even though i've had to create the same functionality separately for delivering desktop applications myself. For example it would be nice if jlink could be used to create a multi-platform distribution package for your components without including the jre as well (i.e. lib/linux-amd64, lib/windows-amd64 for native libraries and package up all the jars etc), but I guess that isn't the purpose of the tool and it will be useful for me nevertheless. There is definitely some merit to precise versioning of validated software (aka configuration management) although these days with regular security updates it spreads the task of keeping up to date a bit further out.
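
The jlink step itself is short enough - a sketch, assuming the jmod built above sits in build/jmods and that the tools module had a launcher class (au.notzed.zcl.tools.Main here is a made-up name):

```
# build a trimmed runtime containing just the zcl modules + their deps
jlink --module-path $JAVA_HOME/jmods:build/jmods \
      --add-modules au.notzed.zcl.tools \
      --output build/zcl-runtime

# the result is a self-contained tree with its own bin/java
build/zcl-runtime/bin/java -m au.notzed.zcl.tools/au.notzed.zcl.tools.Main
```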

One nice thing is that the module path is actually a path of directories and not modules. No need to add every single jar file to the classpath, you just dump them in a directory. This wasn't possible with the classpath because the classpath was what defined the dependencies (albeit in a pretty loose way).


As I only cross-compile my software for toy platforms I was also curious how this was supposed to work ...

I found a question/answer on stack overflow that stated unequivocally that jlink was platform specific and that was that. This is incorrect. jlink is platform agnostic and you must supply the location of the target system's JDK jmods. The only problem is that right now the only microsoft windows jdk is an executable installer - no tar is available - one only hopes this is a temporary situation. So the binary must first be run inside some microsoft windows instance and then I believe I can just copy the files around. Maybe it will work in wine, but either way I haven't checked this yet.


One issue I do see as a free software developer is that once you jlink your modules, they are no longer editable (or cross platform). By design the modules are stored in an undocumented and platform-specific binary format. So here's the question ... how does this affect the GNU General Public License, and particularly the GNU Lesser General Public License? The former perhaps isn't much different from any statically linked binary - because GPL means all source must be GPL compatible. But in the case of the LGPL it is possible to link with non-GPL components - but only if the LGPL components may be replaced by the receiver of the software. In dynamic linking this can be achieved by ensuring all LGPL components are isolated in their own library and simply changing the load path or library file, but for static linking this requires that all object files are available for re-linking (hah lol, like anyone gives a shit, but that's the contract). So anyone distributing a binary will have to distribute all the modules that were used to build it together with the source of the LGPL modules. Yeah I can see that happening. Or perhaps the module path might be enough, and there is a mechanism for patching (albeit intended for development purposes).

Still, I think an article may be required from The Free Software Foundation to clarify the new java situation.

It would also be nice if jlink had support for bundling source, which would address much of the GPL distribution issue. Obviously it can be done, since the jdk itself includes its own source-code - but that gets put into lib/ which seems an odd place for it (again i don't know how flexible this structure is although it appears to be quite limited).

Update: Ahh covered by the classpath exception I guess, at least for the JRE. If one builds a custom JRE though any third party modules would also require the classpath exception? Or would including the source and all modules used to create the JRE suffice? Hmmm.


I would guess that the modularisation will have slow uptake because it's quite a big change and locks your code into java 9+, and it may evolve a little over the next jdk or two. I'm in two minds about using it myself just yet for this reason and also because NetBeans has failed to deliver any support for it so far that i can tell (I was disappointed to see Oracle abandon NetBeans to apache, which is most probably part of it, and they're too busy changing license headers to get any real work done). There will also likely be blowback from those invested in existing systems, merits or not. And then there's dealing with the fuckup of a situation that android/"java" is in.

I myself will poke around with it for a while and merge the functionality into java.make, it's actually a pretty close fit to everything i've done (which is a nice validation that my solution wasn't far off) and will simplify it even if i might have to make a few minor changes like platform names.

It's a pretty good fit for my work but will require a bit of setup so I won't rush into it (a couple dozen lines of shell is doing a good enough job for me). Besides, it's probably worth getting some experience with it before committing to a particular design. The NetBeans situation will also be a bit of a blocker and i'll probably wait for 9 to be released first.

Tagged java.
Saturday, 09 September 2017, 04:53

zcl 0.6

Yes it still lives. I've just uploaded an update to zcl.

A bunch of bugfixes, new build system, more robustness, and OpenCL 2.1 support.

There are still some things i'm experimenting with - primarily the functional/task stuff, as it's just not flexible enough - but it's stable and robust and easy to work with, so i'm no longer using JOCL for anything at work.

On a personal note I still haven't really gotten back into hacking and i had a short sojourn into facebookland so i haven't had much to write about. It's mostly been work, very poor sleep, and drinking! Oh and I started wearing kilts ...

Tagged java, opencl.
Copyright (C) 2019 Michael Zucchi, All Rights Reserved. Powered by gcc & me!