Hmm, nearly been 3 months since last I poked a few bytes onto this blog. Sure has been a hot summer around here - nearly 2 straight weeks over 40 - topped 46 one day. After that, even 35 feels pretty comfortable. Fortunately no fires around here, but that might be more luck than anything - we've had no rain to speak of since December - which is reasonably uncommon even for this, the 'driest state in the driest continent', as the beer ad said. I had a good few weeks off over Christmas - and spent most of the time playing ps games or outside wrecking my back yard - only just starting to get back into hacking.
Work has been pretty dull - ongoing maintenance, documentation, bug fixing and testing - you know, all the great stuff you put off till you can't put it off any longer. Unfortunately, the bell tolls for me. At least I'm doing the doco using LyX -- that's come along a bit since I last looked at it. Pretty comfortable environment for pumping out documentation, much better than any crappy word processor -- faster, easier, and better quality output. Although their 'WYSIWYM' mantra is bullshit - none of the character formatting is semantically based (c.f. texinfo for instance), and defining your own styles is overly complex.
On the home hacking front, after playing around with C for a little while, I've been poking around Java again. I like the simplicity; things just make sense. It's still a breath of fresh air after all that c-hash rubbish.
I thought I'd look at re-implementing, in Java, the CMS I'd been poking around with in C for the last few months -- but not really getting anywhere with. I was going to do all the UML stuff just to give it a go, but the tools around all suck, and emacs and a bit of pseudo-code solved my problems more easily. Oh well, maybe later. The Berkeley DB Java Edition is really nice - it maps very naturally to Java, so it removes all the boiler-plate and marshalling I needed to do in C. It also simplifies the index setup by automating some of the fiddly details. So it was pretty trivial to re-implement the database part in Java - although it certainly helped that I'd done it before and knew the schema I wanted. I also worked out a way to simplify the schema significantly whilst keeping all of the major features of the design: from 5 tables down to 3, with fewer to maintain manually. I looked through an old document I wrote about it previously and it didn't read particularly well - it probably pre-dates getting the previous design working - so I might write the new one up one day.
Now I'm working on the texinfo parser bit. This needed doing from scratch, since the C version used flex and I didn't want to copy it - it was needlessly complex and fragile anyway. And it turns out a hand-rolled parser and lexical analyser is really quite simple to write, and should be easier to maintain and extend. I guess I may go back and convert it to C at some point as well, which should be trivial (a C version would be handy for me for various reasons). I've got most of the basic parser framework done, although I still need to implement output translation and meta-data capture. Oh well, no rush.
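The core of such a hand-rolled lexer really is only a handful of lines. This is just an illustrative sketch - the class and token names are made up for this post, not from my actual code - that splits texinfo-ish input into @-commands, braces and text runs:

```java
import java.util.ArrayList;
import java.util.List;

// Minimal texinfo-style lexer sketch (hypothetical names): splits
// input into @-commands, brace delimiters, and runs of plain text.
public class TexinfoLexer {
    public static List<String> tokenize(String src) {
        List<String> tokens = new ArrayList<>();
        StringBuilder text = new StringBuilder();
        int i = 0, n = src.length();
        while (i < n) {
            char c = src.charAt(i);
            if (c == '@') {
                if (text.length() > 0) { tokens.add("TEXT:" + text); text.setLength(0); }
                int j = i + 1;
                while (j < n && Character.isLetter(src.charAt(j)))
                    j++;
                if (j == i + 1 && j < n)
                    j++;                       // @@, @{ etc: single-character command
                tokens.add("CMD:" + src.substring(i + 1, j));
                i = j;
            } else if (c == '{' || c == '}') {
                if (text.length() > 0) { tokens.add("TEXT:" + text); text.setLength(0); }
                tokens.add(c == '{' ? "OPEN" : "CLOSE");
                i++;
            } else {
                text.append(c);
                i++;
            }
        }
        if (text.length() > 0)
            tokens.add("TEXT:" + text);
        return tokens;
    }

    public static void main(String[] args) {
        // prints [CMD:code, OPEN, TEXT:foo, CLOSE, TEXT: bar]
        System.out.println(tokenize("@code{foo} bar"));
    }
}
```

The parser proper then just becomes a loop over these tokens, recursing on OPEN/CLOSE.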
Hmm, I woke up from a stupid nightmare 19 hours ago, how come I'm not sleepier?
Oh, the economic side-show has been interesting so far - I wonder when we're getting to the main event? Idiot pollies are still going around here either making out everything's hunky-dory, or illogically blaming the other side for any problems we're facing or going to face. Japan's massive drop in export income might just be the event that drives home the fact that our prosperity is helplessly at the mercy of the rest of the world's appetite for our expensive dirt, and that local labour laws or arguments about how much money is being splashed around will have little to do with any long-term outcome.
Swings and roundabouts.
Ok, back to a bit of hacking. There's something about spending all day chained to a windows xp desktop and a visual studio session that simply sucks the life out of you, leaving little time or energy to pursue other hobbies or interests. But with the lengthening days and a lack of TV to watch, I've found time to look back at Java again.
The last time I did any Java was 1998-1999, so it's been a long time between drinks, so to speak. Although I have tried to like eclipse, I never really did. And the last time I went looking for plugins, all the interesting-looking ones (for what I was after) were out of date, or had nothing but dead links - so that put me off even bothering. So this time I tried netbeans - and by chance 6.5 had just been released.
Well, I was mostly impressed actually. Swing actually isn't that bad. And it's a breath of fresh air to have some decent documentation for a change, after dredging through the crap for WPF. Or even to be able to look at the source code! Wow! Almost like the 'good old' GNOME days, where I built everything from glib up locally and so had a much better time of debugging. At first the GUI editor seemed as crappy as the WPF one - defaulting to hard-coded layouts. And when I couldn't find any table or v/hbox/stackpanel equivalents I thought the whole system was really stuffed. But then I discovered that the layout container can automatically snap to theme-specified offsets, can align columns and rows, and actually aligns baselines rather than bounding boxes. Wow. I don't know if it'd be easy to use from code, but with the netbeans editor it is quite nice. Far better than gtk+'s (and WPF's) fixed-size crap and having to worry about whether it's an inner or outer margin - it just does that automagically, and even scales for different themes. Nice. Anyway, after struggling with vbox/hbox/table (ugh), and then grid/stackpanel (sigh, even worse) for years, I instantly fell in love (I think it's a GroupLayout, or maybe a netbeans-specific AbsoluteLayout).
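Out of curiosity, driving GroupLayout directly from code looks something like the sketch below. This is only my rough guess at typical usage from the javadocs - not anything netbeans generated - but it shows where the automatic theme-specified gaps and baseline alignment come from:

```java
import javax.swing.GroupLayout;
import javax.swing.JButton;
import javax.swing.JLabel;
import javax.swing.JPanel;
import javax.swing.JTextField;

// Sketch of using GroupLayout from code: gaps come from the
// look-and-feel rather than hard-coded margins, and vertical
// parallel groups can align on text baselines.
public class GroupLayoutSketch {
    public static JPanel buildForm() {
        JPanel panel = new JPanel();
        GroupLayout layout = new GroupLayout(panel);
        panel.setLayout(layout);
        layout.setAutoCreateGaps(true);           // theme-specified gaps between components
        layout.setAutoCreateContainerGaps(true);  // and around the container edge

        JLabel label = new JLabel("Name:");
        JTextField field = new JTextField(20);
        JButton ok = new JButton("OK");

        // Horizontal: label, then a column holding the field and button.
        layout.setHorizontalGroup(layout.createSequentialGroup()
            .addComponent(label)
            .addGroup(layout.createParallelGroup(GroupLayout.Alignment.LEADING)
                .addComponent(field)
                .addComponent(ok)));
        // Vertical: label and field share a baseline, button below.
        layout.setVerticalGroup(layout.createSequentialGroup()
            .addGroup(layout.createParallelGroup(GroupLayout.Alignment.BASELINE)
                .addComponent(label)
                .addComponent(field))
            .addComponent(ok));
        return panel;
    }
}
```

Verbose, but every margin and alignment decision is left to the theme - which is the bit the netbeans editor gives you for free.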
Oh, and it all runs quite snappily too. Not that it isn't a bit of a pig - but machines are so much bigger now, so it isn't really an issue - and UI apps seem just fine. In fact netbeans is a lot faster than visual studio, in every way. A lot faster. I do wish you could set bitmap fonts though - I'd love to just use 'fixed' as my font.
I thought I'd miss 'delegates' - as a C coder I littered my code with function pointers all the time, I really couldn't get enough of them, and c-hash delegates are much the same (although I don't use them the same way). But using anonymous classes actually looks neater for some reason - it's basically what c-hash is doing 'under the bonnet', and often the delegate callbacks do so little they don't warrant their own function anyway. Properties are sort of nice - but they are only syntactic sugar; they don't actually do anything differently to a getter and setter, and they provide no additional facilities like automatic property monitoring. c# events are quite nice though - again only syntactic sugar, but they do save a lot of boilerplate code.
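To illustrate the point, the anonymous-class version has basically this shape (all the names here are hypothetical, just for illustration): a delegate in c-hash is roughly a single-method interface in Java, and an anonymous class instance plays the role of the bound delegate.

```java
// Sketch: the Java equivalent of a C# delegate/event callback,
// using a single-method interface plus an anonymous class.
public class CallbackSketch {
    // What the 'delegate' keyword hides: one interface, one method.
    interface Callback {
        void invoke(String event);
    }

    static final StringBuilder log = new StringBuilder();

    // Something that fires callbacks, the way a C# event would.
    static void fire(Callback cb) {
        cb.invoke("clicked");
    }

    public static void main(String[] args) {
        // The anonymous class: the handler body sits right where it
        // is registered, so tiny callbacks don't need a named method.
        fire(new Callback() {
            public void invoke(String event) {
                log.append("got ").append(event);
            }
        });
        System.out.println(log);   // prints "got clicked"
    }
}
```

A bit wordier than a delegate, but the callback and its registration read as one unit - which is why it often ends up looking neater.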
Apart from that, it struck me just how much like Java that c# really is. In true ms fashion all they did was copy and complicate. The simplicity of the Java language is nice.
Some of the libraries are nice too. Simple. But with others, you just wonder what people were thinking. The JSR for binding properties is awful. It looks complicated and messy to use - much of it in the name of 'type safety' - but then they go and use some scripting language for binding expressions that isn't checked till run-time anyway. Sigh. Everyone seems obsessed with standardising and formalising every little thing too. Sometimes the smaller, simpler stuff can just be left to the implementor. Some of the 'design pattern' guys seem to have gone off the deep end - piling complexity upon complexity.
The application at work is a single-user desktop application that uses a SQL RDBMS as a backend. One thing we've contemplated is moving it to a multi-user, server-based system. Now, I would simply not consider using .NET to do this. So that was another reason to re-visit Java. Ahh, j2ee 5. Well, one thing that can be said for Java: you've got a lot to choose from, a lot of it is free, and some of it is extremely good quality. It's really a world away from the closed, money-grabbing, greedy eco-system in the stagnant cesspit surrounding the outflow from microsoft. There just isn't any comparison - they're not even in the same league, perhaps not even the same sport.
I'm still trying to get my head around the persistence framework. I kinda liked the older Entity model, because things were coded directly - at least initially it seems easier to understand (or at least, I figured out how it worked at one point, although I never used it). The newer model does a few things unfamiliar to me, but I'm sure I could get used to it. Our .NET code uses a custom entity-like system and a lot of custom SQL (lots of messy joins), which would be quite difficult to move to the persistence query language, but most of it could be moved to views or stored procedures as well (and probably should be at that). I'd considered nhibernate, but it was a bit immature at the time, and quite frankly I didn't see the worth in investing all that time to learn another meta-language to re-describe what was already in the database (and I still don't - I'm glad as hell that j2ee 5 uses xml meta-descriptors sparingly).
The new EJB stuff is quite nice in some areas. Last time I worked on Java this stuff didn't even exist. We were using CORBA directly. The automatic CORBA servant generation from interfaces is ... well it's nice. But there are some `strange' limitations. Well they're not really strange - it forces a particular architecture which will scale transparently. But if you don't really need that it does limit things needlessly - like passing around client-hosted interfaces. Although facilities like JMS can often be used to implement the same sort of things anyway. JMS is nice.
One problem is that although all of these really nice facilities exist, it can be a real pain getting them to actually work. I was playing with JMS, and even though I was using the 'correct' mechanism for sending messages, I was running out of connections. A bug in the app server perhaps? I'm not sure - and not knowing is a bit of a problem. And with a distributed application I was hoping to re-use the persistence objects remotely, but that doesn't really work. Ho hum, back to the (manual) grind of load--copy to mirror object--return, etc. In another case, I tried changing a persistence object from being LAZY to EAGER loaded - it crashed with what looked to me like an interface mismatch between server and client. Couldn't work that one out. Actually, in general netbeans+glassfish seems terribly unreliable at consistently rebuilding dependencies. Maybe I'm doing something wrong, but even with fewer than a dozen files I often have to run it twice, or shut down everything and clean+build to get new code loaded (this is something that affects visual studio too).
I shall continue to tinker.
Sackboy, alias The DRM Kid
Ahh well, so my `worst fears' about the LittleBigPlanet (LBP) moderation system seem to have been realised. I knew when Sony decided to recall all copies because of a possible religious offence, only days before a world-wide launch and at no doubt very high cost, that we were in for a messy ride if not a complete cluster-fuck.
It seems LBP's moderation system is as harsh as it is limited. From what I can tell, levels get deleted from the server with no explanation for even the slightest infringement - you cannot even play your local copy if you are online, and they retain persistent identifiers so you cannot copy/edit and resubmit them either. There is no ratings system, so everything has to be child-friendly (why does the PS3 have a parental control setting then?), but also it means it has to be inoffensive to everyone - everywhere. Unfortunately, if you look hard enough you can always find someone who will find offence in whatever it is you are doing, no matter how commonplace or inoffensive it may be to you and your associates.
Then there are those nefarious so-called intellectual `property' laws. Trademarked, copyrighted, or even patented(!) items cannot be represented in the game. So I guess no one-click shopping levels! The problems are only compounded by the difficulty of understanding what these laws actually mean and how they are applied - which might not even be possible if they are trying to comply with some lowest-common-denominator of the laws from every country that has the PSN. For a family-oriented game, almost certainly to be used without the supervision of legal counsel, the likelihood of running afoul of these laws is quite high. And when someone's home-spun creation vanishes without explanation, I'm sure they'll let all of their friends know of their frustration and anger. Hey, maybe they can get Fry in to do some more tutorials - teaching the general populace about the evils of copyright violation!? `Here's the DRM Kid, he'll protect us from those nasty pirates!' (rather ironic if it happened, given his recent piece extolling Free Software for The GNU Project's 25th anniversary).
Ahh, what a mess these laws have created for all of us - even those who wanted them.
But what about the game itself?
I actually bought LBP on Friday and spent a lot of the weekend playing it. I had considered the censorship issue and thought it would probably work itself out. Reading the current threads on the forums does make me think less of the game, although of course it doesn't change the gameplay of the built-in levels.
It looks gorgeous, with the materials, textures and lighting, sound and effects all spot on (apart from the jump 'whoosh' sound, which I don't particularly care for). The level designs are varied and mostly interesting, and there are constant puzzle elements when trying to advance, or to get to bonus items on a level. The controls are overall pretty good, although since everything is physics-based it can take a little getting used to compared to programmed behaviour. The 3 levels of depth are a bit fiddly at times, but it isn't a deal-breaker. It has some frustrations - checkpoints are limited, so death can mean repeating an entire level. The `emote' functionality (happy/angry/sad/scared) is mostly pointless - usually you're too far away to even see it properly, and when you can, it sort of wobbles and flickers a bit - which makes it look a bit silly. It also definitely isn't a young kids' game - it is too difficult for that, requiring exact timing and jumping in places - although some of the mini-games are for everyone. Multiplayer can make level traversal more difficult - but that is part of the fun too, e.g. interfering with someone so you get more of the goodies. And each level has at least one 2-player puzzle for extra items. But on many scenes the camera doesn't pan enough, so you can get left behind very quickly - after a short timeout you die, until whatever player the camera is on gets to the next checkpoint. I haven't played online.
The user-created levels are a mixed bag - there are so many that it is difficult to find the really good ones. There are too many (utterly pointless and annoying) `get trophies quick' levels high in the list, and people are already asking on forums for levels to be `hearted' for the purpose of gaming the system. So it's mostly just a random guessing game, although with time the tagging system will probably become more accurate and thus useful. Levels load pretty quickly - although I've had some loads fail, a retry normally gets them working. The polish and difficulty vary greatly, but there is a fairly wild mix, from simple platform games, to races, to puzzle games and so forth - even a side-scrolling shooter with a sort of string-puppet feel to it(!).
The level editor is necessarily quite complicated. But building things out of real materials with motors, switches, connectors and so forth is fairly intuitive - and a heap of fun. It will take a lot of time to create a good level - and with the broken moderation system in place, it reduces the enthusiasm somewhat - it could vanish without explanation if you make a mistake and offend some culture you're not even aware of, for example.
I don't know if the moderation thing will get sorted out. Hopefully it can be toned down a bit - at the least it must tell people what they did wrong and give them the opportunity to fix it. There are plenty of people who want to make levels in good faith, but if just one level gets deleted permanently with no explanation or recourse, I can imagine they'll give up on the game forever - and probably be angry enough to let everyone else know too. A tiered system with some more grown-up `may-offend but not-illegal' content would be bloody nice too - as it stands there's no possibility of art or satire, just `commercial-mainstream that is so inoffensive it manages to offend' type levels, which will end up being boring. A multi-tier system is probably too much to ask, but they clearly need a better way to handle accidental problems arising through ignorance - otherwise this game will lose sales.
I suspect it may be a scalability issue - rather than check or edit the levels to fix problems they just blocked the level (or a grief-report blocks the level immediately, until it is reviewed, and they have a back-log). I'm still willing to give them the benefit of the doubt, but I guess time will tell on this one, as it always does.
iview and PlayStation 3
The new 2.5 firmware for the PlayStation 3 included some level of Flash 9 support - which opens up quite a few of the sort of sites one might want to access from a TV-connected device. One of these is the ABC's iview product - which is like a DVR of the last couple of weeks of some of the ABC's TV. I have no idea if it works outside of Australia.
There is a little trick to getting it to work on the PS3, but once that is done, it seems to work ok. The first time you go to www.abc.net.au/iview, you need to check the 'do not show this again' thing, and then restart the machine (not just the browser). Then the next time you open it, it should load up ok - although it can take a little while to get going. Viewing other flash sites in a given session also seems to upset it sometimes, but again a restart should get it working again.
It is a bit slow though. At full 1080p resolution it barely works - in fact I got no video at all. At 720p it works but is a bit jerky; 576p is a bit better, but it's a pain since you need to change the resolution of the whole machine, not just the web browser, and 576p looks pretty bad on a HD TV. Another tip - if you change the 'Resolution' setting in the 'Tools' menu (the actual layer on which the browser content is rendered internally) to -2, it speeds things up further, but at a cost to the video resolution. Still, either way it is fine for watching non-action content such as news and talking-head shows. I think Sony need to throw some more effort at the video codecs and flash player, since the CELL should be more than capable of running at least the video at full speed. It would be nice if they implemented 'full-screen mode' too - currently you have to zoom and maybe fiddle with the view a little to get it to show nicely. Perhaps a '-3' setting for the Resolution option when in 1080p mode could help too, and just scale things up - rather than forcing one to reset the display on the whole system.
The menus themselves are unfortunately all done in flash - which means that although they 'look nice', you need to use a mouse pointer, and can't 'cursor-key' between links as you can on HTML pages. This is mostly just an inconvenience when using a controller, but in some cases it breaks the interface. The widget set they use is a bit shitty; for example, in the full programme listing you can only scroll by grabbing the handle in the scroll-bar widget -- which is far too coarse, so you cannot get to every item in the list. It would be nice if they had a simpler, alternative interface that didn't do all that pseudo-3d shit and background animations as well - all it does is make it a pain to use. The video quality itself is pretty low - well, on a big tv it really shows anyway. It's probably comparable to 'high quality' mode on youtube (when the source material is broadcast quality) - it's ok and quite good for talking heads, but action and pan shots are joltingly difficult to watch. Certainly things could be improved - but it is a nice little addition to the free services available. At least until we get PlayTV over here. Perhaps Sony can try to work with the ABC a little, as they seem to be doing with the BBC's iPlayer, to help improve performance and accessibility from a controller interface.
Another issue -- iview traffic is all metered for almost all ISPs -- unlike the older ABC video content, which most decent ISPs graciously provided as un-metered traffic. This is because of the unfortunate use of Akamai for their content delivery -- I find it somewhat surprising and quite disappointing that the ABC would turn to an international service provider for an Australian service, rather than a local company. There should be plenty which are more than capable of supplying the technology required (Akamai was probably seen as a quick and easy solution - but time will tell if it was a good one). They say they are working on a way around this, but while they use Akamai it sounds like any solution will be flaky (as in iiNet's case it seems, according to whirlpool), and/or costly for ISPs to implement - so they may not do it. I guess we'll all find out soon enough. On the other hand, if you have a decent ISP and aren't using your connection to download movies and tv shows, you probably have the quota to spare. And if you are, you probably aren't interested in iview anyway.
State of Indignance
I'm still around. I haven't been doing anything particularly interesting of late and the copious news and blog reading I've been doing has made me too angry and aghast to feel like posting much.
Recently one of my uncles died - at 98. A very impressive feat; he had a large family and strong ties to his community, and was an all-round nice bloke. Not that I'd seen him for a long time, but the family used to visit when I was a kid. I hadn't quite realised how tied into the church his family was. It was interesting to see the role the Church played in forming and bonding his community together. It got me thinking that maybe this church thing isn't such a bad idea after all. But then during the sermon it veered off into dreaming about him enjoying his after-life, and it just felt sad - all of these good people believing in a silly fantasy. Although they did celebrate his wonderful life as well, the emphasis on the after-life seemed both unnecessary and childish.
But apart from that, it got me thinking about how church and community go together. He was part of a small country town in a productive part of the country (even drought years aren't so bad there, generally). In such a setting, where most of the community knows each other, I can see a church being a quality way to help people socialise, and providing some common ground to bind people together. But does this work in a larger town or city? I suspect it does not scale very well. And like many other things, the scale of humanity has out-grown these ideas, and it is probably time we moved on - which society is doing anyway, as census results reflect.
The US election. Wow. That's been quite an interesting one to follow. Normally I am not too fussed about American politics, at least to the level of following an election campaign (even when I was in the USA in 2000, or was it 2004), but this year something has been different. Certainly there's no shortage of `character' in the players this time around, which makes for some interesting opinion pieces out there. The hate- and racism-fuelled republicans really seem to be coming out of the wood-work, encouraged in large part by the divisive wedge politics the right loves to play with. And I think, like many, I find the prospect of an educated and thoughtful new face running the USA for once (during my life-time) intriguing. And really, for the sake of humanity, I think the rest of the world is hoping for an Obama victory; McCain is one angry, grumpy old man who is likely to do anything in a fit of rage, or just die of old age, and Palin just wants to establish a fascist theological state and probably help accelerate the apocalypse (and all that before dinner with the family). Of course, with the (world) economy in such a mess, victory may be a bit of a poison pill, but there isn't much choice. Another point of interest is how voter-disenfranchising activity is even remotely legal or tolerated, or even considered in the first place. What sort of a fucked up `democracy' is that?
Ahhh, the world economy. It hasn't affected me at all yet -- apart, probably, from a bit of super which didn't make money even when things were going well. So again, fascinating to follow the ups and downs of the stocks and whatnot. I can't really add much to the teeming cesspools of comment already out there -- much of which I have read -- only that it sucks that at the end of the day, the rich will get richer and the poor and middle will pay for it -- again. A few new regulations and some hardship for a few years -- until the cycle repeats itself. But while capitalism reigns there is little choice -- not everyone can be rich, so plenty have to be poor. At least while the rich get unfair electoral representation (i.e. can bribe officials).
Where is the world headed, I wonder? It is easy to get pessimistic. Even without the threat of global warming things are not looking terribly rosy. With the world population continuing to rise unabated, land continuing to be degraded beyond use, the sea over-fished, toxins and contaminants continuing to build up in the food chain, water being poisoned or hoarded, and personal greed continuing to trump national welfare, is there really a bright future for humanity? I think an area to watch in the next few decades may be India -- as a representation of the problems to be faced by the entire world, particularly overpopulation, pollution, religion-fuelled ignorance, and wealth disparity.
Add global warming and things could really get nasty. I read a few sceptic and science blogs, and it is surprising how many of them are overly sceptical of global warming (sceptical should mean requiring hard proof, not just being universally cynical), or don't see it as an issue worth regulation or spending money on. Maybe in England some nicer weather isn't seen as such a bad thing, even if it means wilder and more frequent storms occasionally. But there are going to be some pretty nasty consequences even for them (disease spread, costlier food), and the risk and cost of trying to fix something we had nothing to do with pales into utter insignificance against the risk and cost of not fixing something we actually caused.
And the more mundane. Work continues to be pretty dull. Mostly writing little data importers, and doing a lot of manual data verification and manipulation. I haven't had to write any really new code for months. I tell you what though, Microsoft products and their proprietary file formats have meant a lot more work for me, and a lot more frustration. Where I can, I've moved to using simple CSV files for most data files. Apart from being trivial to read and write, they are also bloody fast! But excel has to make using any `non-native' file format a right pain in the arse. Just saving a file in CSV format takes 4 extra clicks, and it still warns you when you close the file that you haven't saved all changes -- even if you have. So a big F.U. goes out to B.G. the big C for helping to make life more difficult for all of us with your crappy tools and shitty file formats.
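And 'trivial to read' really is trivial. A minimal sketch, in Java since that's what I've been playing with lately - the class name is made up, and it deliberately ignores quoting, which the kind of simple data files described above don't need:

```java
import java.io.BufferedReader;
import java.io.IOException;
import java.io.Reader;
import java.util.ArrayList;
import java.util.List;

// Minimal CSV reader sketch (hypothetical name): one String[] per
// line, split on commas, no quoting or embedded-newline support.
public class CsvSketch {
    public static List<String[]> read(Reader in) throws IOException {
        List<String[]> rows = new ArrayList<>();
        BufferedReader r = new BufferedReader(in);
        String line;
        while ((line = r.readLine()) != null)
            rows.add(line.split(",", -1));   // -1 keeps trailing empty fields
        return rows;
    }
}
```

Writing is even less code - join the fields with commas and append a newline - which is exactly why the format is so fast compared to the proprietary alternatives.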
Haven't been playing many games either. I got to the last battle in Rogue Galaxy but can't be bothered finishing it (lack of save points). I'm pretty pissed off LBP got delayed, particularly as it certainly looks, on the surface, to be pandering to the irrational beliefs of a random internet poster, but I'll keep those thoughts to myself, at least for now. I just hope this doesn't indicate a predisposition to censor `offensive' content once the service goes live. And speaking of censorship, that idiot Conroy should go for trying to bully Mark Newton for stating the obvious about the ludicrous scheme to filter the entire Australian internet.
Tearing, Game Demos and Controls
Ahh, game demos. Do they really convince anyone to buy a game? Or do they just convince them not to buy it? I can feel a whinge coming on ...
A couple of new demos on the PSN today: Mercenaries 2 and Fracture. Actually, neither are games I thought looked interesting enough to buy before I tried their demos, but having tried them, they're even less likely to be swapped for some hard-earned plastic.
And the main reason? Controls. Both let you 'invert y', but not x! For some reason - I think Jak and Daxter - I learnt to use camera controls opposite to the rest of the world.
After a couple of minutes of looking at the floor or the sky or spinning in circles, I gave up on Fracture. It was just too frustrating and annoying - even for a demo. They've obviously put a lot of work into it, and from the small demo I saw it is probably a competent game, but without inverted controls all it ended up being for me was deleted from my hard drive. From what I could tell, the ground-altering mechanic is a little odd, as I expected it to be - a bit neither here nor there - but I guess it could 'work' ok if its use isn't too gimmicky or forced.
Mercenaries 2 wasn't much better. I ran around randomly and somehow ended up where I was supposed to, but well, died. It looks like it could be fun, but running around looking the wrong way isn't. I like the stylised graphics, and the explosions are nicely done.
Both suffered another major turn-off for me too - screen tearing - where they were too memory-strapped or lazy to use multi-buffering. Some devs claim that using double-buffering would halve the frame-rate when they just miss a frame, which is true (if you just miss a frame of a 50fps animation, you have to wait a whole frame before you can flip, so you end up with a 25fps frame-rate, spending approximately half the available time/cpu power doing nothing) - but triple buffering doesn't suffer this problem; it's mostly just a memory cost. Well, at least the tearing was only minimal, but it was still there, and it is a visual glitch I've found particularly irritating ever since I first saw it on crappy PC games that didn't have the hardware to easily avoid it on every frame (the way the Amiga hardware worked you had to go out of your way to make things tear, so it was a real shock). It's such a tiny little nod to quality that can't cost more than a tiny fraction of the ginormous bloody budgets they spend these days - I can't see how the art department could sign off on such a sloppy trade-off (versus dropping the texture resolution slightly, for instance).
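The arithmetic above can be sketched as a little back-of-envelope calculation - the numbers are hypothetical (a 50Hz display and a 21ms render time that just misses the 20ms slot), and this is only the idealised model, not a simulation of any real GPU:

```java
// Back-of-envelope: effective frame rate under vsync when rendering
// takes slightly longer than one refresh interval.
public class BufferingSketch {
    // Double buffering + vsync: a frame that misses the deadline must
    // wait for the next refresh before it can flip, so render time is
    // rounded up to a whole number of refresh intervals.
    static double doubleBuffered(double refreshHz, double renderMs) {
        double intervalMs = 1000.0 / refreshHz;
        double slots = Math.ceil(renderMs / intervalMs);
        return 1000.0 / (slots * intervalMs);
    }

    // Triple buffering: the spare buffer lets the GPU keep rendering,
    // so throughput is limited only by render time (capped at refresh).
    static double tripleBuffered(double refreshHz, double renderMs) {
        return Math.min(refreshHz, 1000.0 / renderMs);
    }

    public static void main(String[] args) {
        System.out.println(doubleBuffered(50, 21)); // 25.0 fps: frame rate halved
        System.out.println(tripleBuffered(50, 21)); // ~47.6 fps: barely affected
    }
}
```

Which is the point: for one extra screen-sized buffer, missing a 20ms deadline by 1ms costs you ~2fps instead of 25.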
Ok, yes, they were only demos - and sometimes problems like this get fixed by release time (Burnout?), and often control inversion is also included in the final version - but that doesn't help in evaluating a game from its demo. It's not like I had planned to buy either of those games (I guess they're not my type of games), but the demos didn't help to convince me otherwise.
While I'm on demos, last week we had 'Pure', a sort of trick-bike off-road racing game. Weird choice of game mechanics: you have to do slow, clumsy, and hard-to-pull-off `aerial tricks', otherwise you don't get enough boost/juice/whatever they call it to be able to win a race. Sounds pretty tedious to me. Actually, I may have had enough off-road racing with Motorstorm - I'm still not even sure I'll get Motorstorm 2. Well, the local split-screen would be nice, and maybe it'll load tracks and cars faster. Pure does look nice though.
I bought a new HDD for my PS3 yesterday, backed it up and installed it. Very easy process, although the screws holding the drive in its caddy were a bit tight for the jeweller's screwdriver I was using, and it took a couple of hours to back it up and restore it (nearly 50GB used). Disturbingly though, it sometimes seems to start up a bit too slowly, and then goes 'missing' at power-up until a restart. No big deal I suppose - if that's all it does. It's a Western Digital 320GB drive (WD3200BEVT), fwiw. Ahh, I did a search, and it seems to be some sort of interface issue - I tried jumpering the 'RPS' mode on, and so far it looks to have done the job.
I didn't bother backing up the Ubuntu partition, beyond my source tree. I haven't been particularly happy with Ubuntu, and that was even before upgrading to 8.x broke everything. Even after I fixed the boot issue, all it did was leave me with an unusable amount of RAM detected. I had long since got rid of any Ubuntu on my laptops, so I didn't need much of an excuse to jump ship. As I've said elsewhere - I'm sure Ubuntu is just fine for plenty of people, but it certainly isn't for me.
So I spent a bit of time trying to work out what system to install. It was pretty depressing really - it was quite difficult to find ANY quality or useful information at all (or maybe it was that incredibly disturbing video and comments I'd seen earlier in the day?). There are a few blogs and news sites around for PS3 development, but many (most?) of them are quite stale. Often started with a flourish and soon forgotten.
Of those still active, the developerworks forum for Cell development is for the most part full of newbies with very basic GNU or C/parallel programming questions, or weird arguments about performance (e.g. 'why is the ppe so slow?', 'how come i can't get the peak theoretical bandwidth in a memcpy?', sigh.). The beyond3d and ps2dev forums seem to be stuck on the fact that the GPU is inaccessible, or relying on and waiting for a couple of guys working on ps3-medialib to deliver some magic, and generally just don't seem to be all that helpful. There are a couple of queries about useful linux distributions, but they are either unanswered or not helpful to me (e.g. they recommended ubuntu). I was starting to think that the whole situation was a lost cause, and certainly nobody seems to be working together toward any common goal. I finally stumbled upon PS3 forums - which seems to be a bit more active, and a few of the sort of questions I was interested in at least had some sort of answer.
Anyway, since there was a decent article on DeveloperWorks about installing FC7, and the IBM SDK 'support', I thought I'd give that a go. Burnt a DVD and away I went - I even checked the media. But unfortunately, for some reason it couldn't find the DVD it booted off when it came to looking for packages - so I couldn't get any further. Bit of a waste of time. I couldn't find any mention of this show-stopper on the 'net, so I gave up. I have FC9 on my laptop and although I'm happy enough that it works, the default setup is far too fat for a PS3, and it took forever to install, so I thought I'd give it a miss.
Someone on ps3forums.com had suggested Ubuntu to a query of what distro to use, but then changed his mind and complained about how much of a timewaster it was, and suggested YellowDog. I hadn't really considered YellowDog - it seemed a bit out of date, and well, just different, but after my experience upgrading Ubuntu - to get features I didn't really need - I thought stability and ease of setup would do over bleeding edge. So, YD downloaded (fortunately my ISP mirrors all these DVDs, so the download is as fast as the phone line can muster). Hmm, Fedora 6 based. Ok, so it's a bit old. Still - the install worked just fine first time. It was also pretty fast, and it boots up pretty fast too. Both much faster than Ubuntu ever did. For some reason the default 'development' install doesn't include ppu-gcc and particularly ppu-binutils, but I found out what I needed, and it seems some of my test code can build and run. I can always compile a newer gcc if I need it.
Ahh well that's done, and I've updated it too, now I can reboot back to the GameOS and forget about it for another few months!
I recently posted my last entry on b.g.o, and I said I wasn't going to rant about what is wrong with the desktop (well I did before I deleted it). But maybe I should have, as with fortuitous timing, my second-to-last entry about Chrome should have reminded me of what Chrome is capable. I'll only say in my defence that I was considering Chrome purely as a browser, and maybe as an `ms office' replacement, and dismissing views otherwise (well that is how I use a browser).
First, some background. I had been noticing the trend to move toward Python in GNOME in particular, and I haven't liked it. I know why developers like it (well, why they claim to like it), but as a user it leaves a lot to be desired - slow, extremely heavy applications that too often bomb out with meaningless backtraces. I had some ideas that could make it palatable to users (well, beyond just debugging), but they relied on some features which Python lacks, so I gave up thinking about it. But Python isn't the only problem.
The GNU desktop is in an awful state - and that's even if you stick to just one flavour and its attendant applications (I don't know about KDE, but the following is true of both GNOME and Xfce). If you take a default install of your average `distribution', for example, Ubuntu, after installing a rather large number of packages you end up with a pretty login window, a relatively pretty desktop, and quite a few applications, from basic to outstanding, from buggy to stable. But what is behind the actual desktop? A mish-mash of random programmes the packager/desktop team determined to be useful for themselves or some mythical `average luser'. Some work well, some don't, some are necessary for the basic operation of the machine (auto-mounting and network selection), others are pure fluff, most are in-between. Also - it barely runs OK if you have only 256MB of memory, for example that `older machine' that GNU/Linux can supposedly take advantage of, or embedded/special machines, like a Playstation 3, both of which actually affect me.
One problem is that the `in thing' these days seems to be to write (or re-write!) many of the applets/applications that provide core desktop functionality using Visual BAS... oh oops ... Python. Now Python is a `scripting language'. This means that every time you run a python app, the interpreter must compile the source-code into byte-code or perhaps machine code (I do not know if there are pre-compilers for it). This takes time, and it takes memory, and to do it well it can take a lot of memory and time - this is one reason traditionally that developers had much beefier machines than users: because they're the only ones who had to do this step, once. If it only compiles to byte-code, then every basic instruction is emulated using a state machine - a 'virtual machine' (VM) - which is at least an order of magnitude slower than the physical machine. Any conversion to machine code and further optimisations which make the running speed faster also generally cost memory and cpu time during the compilation phase. For simple scripts and applications this is no big deal, but for more complex applications it can start to add up. Not only that, because many of the libraries themselves are written in the scripting language, every application which uses those libraries needs to recompile the same libraries every time it runs - and more importantly, store its own copy of the byte/machine code. I will also mention in passing that many of these `libraries' are just `wrappers' - glue code which just calls some `C' library to do the actual work; but someone has to write those too, so either the script engine `vendor' or the library `vendor' must expend additional resources (which wouldn't otherwise be needed) for this work, so the cost isn't borne solely by the users.
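To make the compile-then-interpret step concrete, here's a minimal Python sketch (just an illustration, not a measurement): `dis` prints the byte-code the VM dispatches one instruction at a time for a trivial function, and it is a private copy of exactly this kind of compiled code object that every interpreter process carries around.

```python
import dis
import io

def add(a, b):
    return a + b

# Disassemble the compiled code object: these are the byte-code
# instructions a software VM steps through one at a time, rather
# than the CPU executing the operation directly.
buf = io.StringIO()
dis.dis(add, file=buf)
listing = buf.getvalue()
print(listing)
```

Each opcode in the listing costs a dispatch through the interpreter loop, which is where the "order of magnitude" slowdown over native code comes from.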
Scripting languages are just fine for short-lived applications: they run, do their job, and finish, releasing the memory they used - even if it is excessive it doesn't usually matter. And often they are `batch' processes anyway - non-interactive programmes which run by themselves, and so long as they run to completion they needn't be particularly speedy. But now, with applets and other trivial applications that run for the entire time you're at the computer, or that require interactive response, they are a potential disaster. You now have a separate VM for every application loaded, with all the non-shareable data that entails. Often scripting VMs haven't even been designed with this in mind, and in that case they may be quite cavalier with their use of memory because it isn't an issue for the workloads for which they were designed. Most of these languages use garbage collection too - but garbage collectors are quite hard to write properly, so there are often bugs, and even when those are all fixed, to get performance they generally need more total memory than they're actually using (sometimes by a lot, but often about twice). And again, all of this overhead needs to be duplicated for each VM running. Contrast that to, say, a C application. When an application is compiled in the normal way, all of the code, and all of the code of the libraries, can be shared in memory. Far more time and memory can be spared during the compilation phase, since it is only done once. And explicit memory management at least forces you to think about it, even if you don't take advantage of that opportunity for thought (even if explicit memory management carries overheads for efficiency, it's a trade-off you can control). And finally, often the reason programmers use scripting languages in the first place is because they are easier - or to translate (in some cases) - they don't know any better.
Although they may have the enthusiasm and the ideas, they may just not have the skills to pull it off properly.
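A small illustration of what a collector actually has to deal with, sketched in Python (the mechanism shown is cycle collection; the point is that every interpreter process runs this bookkeeping over its own private heap):

```python
import gc

# Build a reference cycle: the list refers to itself, so plain
# reference counting can never drop its count to zero once the
# name goes away.
a = []
a.append(a)
del a

# Only a separate tracing pass - the cyclic garbage collector -
# can find and reclaim it. This machinery, and the heap headroom
# it needs to run efficiently, is duplicated per VM, per process.
freed = gc.collect()
print(freed)
```

None of this work or working-set is shareable between processes, which is the duplication cost the paragraph above is complaining about.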
Another problem affects all languages - the startup time/non-shared data overhead. Things such as font metric tables (sigh, and font glyph tables/glyph cache, now the font server has been basically dropped - remote X sucks shit now, even though networks are much faster), display information, other global state tables, and other data which is loaded at run-time and could otherwise be shared among applications. This only gets worse when you have many versions of the same library present, and/or completely different libraries which do the same thing. Sure you can run a KDE application on a GNOME desktop, but it isn't at zero cost, as even basic things like displaying a string of text involve an extraordinary amount of logic and data, little of which will be shared.
Having so many libraries to choose from, and indeed a continually changing set of libraries to choose from, is also a particular problem with GNU desktops (and Windows at least). Add to that - people keep coming up with their own `framework' which will `solve all the problems' in a specific domain, but all it really does is add yet another set of libraries (and versions over time) that we all have to put up with if we want to run a particular application that uses them (or worse, the poor developer is burdened with having to develop and maintain yet-another backend when they could be doing real - and more importantly; interesting - work). Even if the one library is the one everyone uses, new versions seem to come out every year or so.
So the result is that in 2008 we have a desktop with barely more features than one in 2000, yet consuming far more resources. Tiny little applets which could just as easily have been written in any language are dragging in millions of lines of code and megabytes of memory by virtue of being written in a scripting one. Lots of libraries - many of which do the same thing, even just different versions of the same one - often end up being installed as well.
There are at least a couple of ways to get around the scripting problem, and they also cover the shared state and libraries breeding like fundie children as well. If you're not using scripting they don't help - but shared state could be addressed using traditional IPC mechanisms (i.e. use a server), though because of the complexity this is often not done. Fixing the breeding library problem in general is tricky - each library needs to be far more disciplined in its design, and make use of ld features for backward/forward compatibility if required. Some duplication is still necessary - competition is generally good - although perhaps application developers should avoid using every new library that comes out just because it is new and promises to abolish world hunger.
First possibility, you have a separate process that compiles and executes all scripts - a script `application server', in today's language. For a stand-alone script, a small client uploads/tells the server which script to execute, and the server sends the results back to the client using queues and/or rpc. Because the scripts are executed in the same address space, they can share libraries, the garbage collector, and other resources. You also have the benefit that if you want to extend your application with scripting facilities, any application can use the same mechanism to run their own scripts. This could also provide a powerful system whereby you can write meta-applications, talking between applications as well, if you design the system properly. Threading is an issue - but it's an issue that only has to be solved once, by people who probably have an idea, rather than clueless application programmers.
The other way is to move your applications to the (one) server. All applications simply run in the same VM/address space, and again all code and much data can easily be shared among applications. Where you need additional non-scripted facilities you either build them in/use plugins, or use IPC mechanisms. And you only have to do it once too. Although meta-application programming is certainly possible, it would have to be an additional layer or protocol that needn't be there by design. And you can't really write an application that has a scripting `extension mechanism' either - since the app is the script.
The first way is sort of how AREXX worked. It can be quite simple, yet very powerful. Nobody wrote applications in AREXX, but they did write meta-applications which literally let completely unrelated applications `talk' to one another. The second way, if taken to the extreme, is something like JavaOS or that M$ thingy that does the same thing.
Hmmm. So I guess one potential realisation of the second idea is Chrome. It isn't a browser, it's an application framework, or rather, an os-independent application execution environment, a meta-operating system if you will. The sort of thing Java was capable of, but didn't work so well because it was too fine grained/no central server. The sort of thing Flash is basically doing now, although it's too buggy and also has no central server. Probably the closest is the sort of thing GNOME was originally envisioned to be (as I fuzzily remember it - the NOM in GNOME) before being down-graded to basically a Gtk theme - although the glandular-fever infected among them are still thinking along those lines, I think. The sort of thing Firefox always claimed to be, but you couldn't take seriously because we all know what a bloaty pig's bum it was, and still is, even though they've made great strides in the swine's bum-tone. Well, at least the process model in Chrome makes sense now.
Ok, so perhaps I was wrong in my second-to-last post on b.g.o. Chrome isn't just another featureless webkit browser after all (although it is still too featureless for me). But it isn't just Firefox that has something to fear from another browser, nor just desktop applications - it's the desktop as we have come to know it - and thank fuck for that too.
Ahh well, maybe that isn't the idea `they' had. It has the potential though, if the VM and GC are as good as the claims on the box. And if Google doesn't do it, someone else can - because it's free software.
Copyright (C) 2018 Michael Zucchi, All Rights Reserved. Powered by gcc & me!