Category Archives: Computer stuff

Sheffugees

This weekend I attended a “Sheffield refugee hackathon” organised by the folks at Yoomee. I really wasn’t sure what to expect: I’d never been to a hackathon before, and I didn’t know how well the fairly specific set of application-development skills I’ve been using over the last few years would generalise to building something of benefit to refugees and asylum-seekers. But it was a great experience, and I’m really looking forward to the next one (which should be happening in around six weeks’ time).

Learning Software Development with Microsoft Office

I was recently listening to a JavaScript Jabber podcast featuring Dan Abramov, in which he said that he got into software development because at school he was taught PowerPoint: he loved making animations, and one day discovered the macros that powered those animations. These macros were actually scripts: by changing values within them, Dan could programmatically alter the animations.

This reminded me of my own route into the software industry: through WordBASIC. (Most of you under 40 probably won’t remember WordBASIC: it’s what Word macros were written in before we had Visual Basic for Applications.)

Functional JavaScript

Two or three years ago, my polyglot colleague Dave Spanton persuaded me to try functional programming. I took a few basic Haskell tutorials, but went no further. I got the sense that there was a far deeper seam there which I really needed to dig into, but time, and the pressures of work, gradually drove the need out of my mind.

In the last couple of weeks, I finally made time to go deeper. And I did so by taking the Hardcore Functional Programming: Advanced JavaScript Coding course, by Joe Nelson and Brian Lonsdorf, on Udemy (Udemy had one of their sales on — I swear they have more sales than DFS — and so I picked the course up for just £12, rather than the £78 advertised).

TDD: When Not To Unit Test

Often when I speak to development teams about their technical debt, one of the issues they highlight is lack of unit test coverage. “We only have 30% coverage, so we’re hoping to set aside some time next sprint to get more tests in place. Our latest work all has 100% coverage, but there’s a lot of code from way-back-when which is completely lacking in tests”.

This seems to me to misunderstand the purpose of unit testing. I can see how this misunderstanding comes about: there is a general acceptance that tests are good, and that a high level of test coverage is good, therefore increasing coverage must be a worthwhile thing. Right?

Trash

In 1996, I was responsible for the “kiosk” in Diesel’s Covent Garden flagship store (a Mac running the Diesel website). I had to go into the store once a month to “fix” it.

On the website were two video ads and a handful of audio files. Netscape (1.2, I think) treated these links as “downloads” to be opened with a helper application. Every time somebody using the kiosk clicked on a video or audio link, a new copy of the file was “downloaded” (from the copy of the website stored on the Mac’s hard disk), and placed on the desktop. When I came for my monthly visit, the hard disk would be full, and the desktop would be stacked 6 or 7 deep with icons of the same few files.

My job then was to delete these files. Macs then (OS5 or 6 – or was it 4?) were a lot simpler than they are now, and I myself was no Apple genius. So I had to drag all 9-gazillion of the files into the Trash. Which was a problem. Because the Trash (and, indeed, the hard drive) was an icon on the desktop. And the Desktop was geological-layers-deep in icons. (And, because the Mac wasn’t totally locked down, the Trash icon itself could be anywhere on the Desktop).

And so I began an elaborate game of Towers of Hanoi. Before I could delete the files, I had to find the Trash. So I would painfully drag the files aside, one at a time, and after an hour or so of this I would finally unearth that little waste-mpeg basket. And then the work would begin all over again: dragging the files into the Trash and, finally, emptying it.

I’ve a sneaking suspicion that this may be what first triggered my RSI; and my hatred of drag-and-drop as an interaction mechanism; and, quite possibly, a lasting suspicion of all Apple products.

From Banners to Apps

Last week I gave a presentation to the Midlands Flash Platform User Group. It was the result of some thoughts & conversations which started to fizz around my brain during last year’s Flash on the Beach conference. The talk, “From Banners to Apps”, was a brief(ish) distillation of my 15 years in the Internet industry – what I have done and what I have learned. I was quite pleased with how it went (although it was far from perfect – if I were to do it again, I would try to encourage a bit more two-way communication with the audience).

Here is a PDF copy of my presentation, complete with vaguely-cryptic presenter notes.

Am I Completely Insane Or What

Another ActionScript-related memory-usage post. I’ve been doing some experiments with flash.display.Loader (a class which has always seemed out to get me). Today I discovered something which is just so weird it makes me doubt my own sanity. I’d be grateful if any Flash/Tamarin experts out there could help me verify my sanity/insanity.

Here is the test script I’ve been running:

package
{
    import flash.display.Loader;
    import flash.display.Sprite;

    public class TestMultipleLoaders extends Sprite
    {
        public static const NUM_LOADERS:int = 500;

        public function TestMultipleLoaders()
        {
            init();
        }

        private function init():void
        {
            for (var i:int = 0; i < NUM_LOADERS; i++)
            {
                createAndDestroyLoader();
            }
        }

        private function createAndDestroyLoader():void
        {
            // Create a Loader, then immediately discard the only reference
            // to it, leaving it (in theory) eligible for garbage collection.
            var loader:Loader = new Loader();
            loader = null;
        }
    }
}

Compile this in Flash Builder and test it under the Profiler. Change the filter to include objects in the flash.* package. Observe how many Loader instances are in memory. Hit the garbage collector. Observe again.
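Incidentally, if you’d rather trigger the collection from code than from the profiler’s toolbar button, System.gc() does the same job. A minimal sketch – noting that the call is only honoured in debugger builds of the player (which is what the profiler runs against):

import flash.system.System;

// Request an immediate mark/sweep pass. Debugger players only;
// in the release player this call is a no-op.
System.gc();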

If your system behaves anything like mine, you will see 500. Which, in itself, is crazy: loader is just a local variable and, in any case, it’s set to null. But bear with me, this is going to get crazier…

Now try playing with the value of NUM_LOADERS. Again, all things being equal, you will see this crazy behaviour (NUM_LOADERS Loaders persisting in memory) for any number between 1 and about 600. Somewhere around that figure, perhaps a little higher (it doesn’t seem to be predictable), you will see the number of persisting Loaders drop to either 1 or 0.

Now what the hell is going on here? My guess is that the Loader class is somewhat resource-intensive to create, and so Adobe are maintaining a pool of them somewhere, although the strategy for doing it seems a bit random.
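In the meantime, here’s one way to sanity-check what the profiler is reporting from inside the code itself, using a weak-keyed Dictionary. This is just a sketch – it again relies on the debug-player-only System.gc(), and GC timing isn’t guaranteed, so a surviving key is suggestive rather than conclusive:

import flash.display.Loader;
import flash.system.System;
import flash.utils.Dictionary;

// With weakKeys=true, an entry disappears once its key is garbage-collected.
var watcher:Dictionary = new Dictionary(true);

var loader:Loader = new Loader();
watcher[loader] = true;
loader = null;

System.gc(); // debug players only

// Count the surviving keys: 0 means the Loader really was collected.
var survivors:int = 0;
for (var key:Object in watcher)
{
    survivors++;
}
trace("Loaders still alive: " + survivors);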

Please can somebody, anybody, enlighten me?

Embed types in ActionScript and memory usage

I’ve spent the last few days doing lots of fascinating ActionScript memory-tests – and hopefully I’ll post some of the results here if I get time – but while I have a quick moment I thought I’d share this finding which (while obvious now I think about it) caught me out.

The Embed meta-tag allows you to embed and access external files directly within your SWF, e.g.

[Embed(source = 'myImage.png')]
public static const MyImage:Class;

Flash seems to automagically detect the MIME-type of your embedded content (in this case, image/png), so that when you call new MyImage() the resulting object can be cast to a Bitmap.
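For example – a quick sketch, where the addChild call assumes the code lives inside a Sprite or other display-object container:

import flash.display.Bitmap;

// MyImage was embedded with the detected image/png MIME-type,
// so an instance of it can be cast straight to Bitmap.
var image:Bitmap = new MyImage() as Bitmap;
addChild(image);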

You can, however, explicitly set a MIME-type for the embedded asset. If you’re crazy enough, you can do this:

[Embed(source = 'myImage.png', mimeType = 'application/octet-stream')]
public static const MyImage:Class;

This time calling new MyImage() will return an object of type ByteArray; in order to convert it into a bitmap, you will need to load the ByteArray into a Loader object.
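Here’s a rough sketch of that conversion (the handler name is mine, and as above I’m assuming the code runs inside a display-object container):

import flash.display.Bitmap;
import flash.display.Loader;
import flash.display.LoaderInfo;
import flash.events.Event;
import flash.utils.ByteArray;

var bytes:ByteArray = new MyImage() as ByteArray;
var loader:Loader = new Loader();
loader.contentLoaderInfo.addEventListener(Event.COMPLETE, onBytesLoaded);
loader.loadBytes(bytes);

function onBytesLoaded(event:Event):void
{
    // loadBytes() is asynchronous; once it completes,
    // the loader's content is a fully-decoded Bitmap.
    var bitmap:Bitmap = Bitmap(LoaderInfo(event.target).content);
    addChild(bitmap);
}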

Now, what caught me by surprise is the way in which the Flash compiler embeds the file myImage.png; I had foolishly assumed that the binary file would be embedded as-is, and then handled appropriately at run-time, but the compiler is a little smarter than that, and tries to handle the binary data according to its MIME-type. This is probably best demonstrated by example. In my test case, I embedded a large uncompressed PNG – the file was 1280×720 and came out at approx. 2.7MB.

With the first style of Embed (the “regular” one), my compiled SWF was approx. 1.7MB in size, and when I ran it, it decompressed to a similar size.

With the second style of Embed (the “byteArray” one), my compiled SWF was a much smaller 800kB in size, but when I ran it, it decompressed all the way back to 2.7MB.

I’m still trying to get my head around the implications of this (with a lot of help from Tish!) – it seems counter-intuitive to me that the decompressed sizes are so different, when presumably the “regular” version will have to be decompressed to a full 1280×720×4-byte (ARGB) bitmap data object – roughly 3.5MB of raw pixel data. Any thoughts?

Some iPlayer Performance Tips

Yesterday, Amy posted this on Facebook:

Amy Dutronc wishes that iPlayer worked properly. It’s like listening to the radio and watching a really boring slideshow.

It soon turned out that lots of other people were having the same problem. They all have good Internet connections, so that wasn’t the issue – in fact, even when bandwidth is low, iPlayer has some amazing built-in logic for detecting this and responding accordingly. The issue is that some of the high-quality video now available on iPlayer requires a lot of decoding power, and some computers – especially older ones and Apple Macs – aren’t up to the job. (NB. I believe there are improvements in the pipeline which will help iPlayer improve playback even on slow machines – but if you’re still unable to get decent-quality playback, the tips below may help.)

The first thing to check is that you have the most up-to-date version of the Flash Player plugin. Adobe have done a lot to improve video performance (and performance in general) in recent releases. If you’re feeling particularly brave, you can install the beta version of Flash Player 10.1 which has even more performance improvements. This will especially benefit Mac users, as the new “Gala” preview release is the first one featuring hardware video decoding for Macs. NB if you do install the Gala preview, you will sometimes see a white square in the corner of your video – so you may want to wait instead for the public release.

If, despite having the latest Flash Player, video still runs jerkily, here are some tips. Try them in the order shown below until you reach a level of quality which your computer can play back without stuttering.

  • Don’t play the HD version of programmes. Obviously, HD is amazing; if your machine will play it then you should definitely choose the HD option. But if your machine is a bit old, or does not have a good video card, then HD can slow it down to a crawl. On each HD programme page is a link underneath the video saying “Also in normal quality”. Click that link for a version less likely to hammer your machine.
  • Play the smaller version of the video. On “normal” programme pages, the video has an icon in the top right-hand corner showing two arrows. (Update: on the new beta version of iPlayer, the size-toggle icon has moved. It is at the bottom of the media player, in between the volume and fullscreen buttons.) If you click on this, it will toggle between a big and a small version of the video. If you have problems playing the big version, click on the arrow to shrink the window down. The two actually use different video files (encoded at 1500kbps and 800kbps) – you can tell which version of the video is playing by right-clicking in the video window: a menu will pop up, and the second line will say something like “1500kbps | h264 | AK 3.5 (1) | 832×468”. The first part of that line tells you the bitrate.
  • Play the low-bandwidth version. If your machine is so clunky that it struggles even with the 800kbps video, then there is one more option: the low-bandwidth version. Normally you would only see this version if your Internet connection is very poor – but you can force iPlayer to play it by clicking on the “Use lower bandwidth version” link hidden near the bottom of the page. Once you’ve done this, right-clicking on the video will tell you that you’re looking at a 480kbps version. If you want to swap back up to the higher-quality version, the link at the bottom of the page will now read “Use normal version” – just click it.
Hopefully, by following one or more of these suggestions, you’ll be able to find the best performance level for your computer.

Disclaimer: this is not an official post from the BBC. Although I worked on iPlayer and am familiar with most of the technologies used in the Embedded Media Player, I am no longer affiliated with the BBC in any way. Also, iPlayer technology can and does change rapidly: I cannot guarantee that all of the above information will still apply.