Recovery: “Musical” SyQuest Cartridges with DoubleSpace Compression

I’ve long offered my “volunteer” data recovery services for those with 44/88/200Mb 5.25″ SyQuest disks, and it’s been a little while since my last SyQuest recovery post. Just when I thought that maybe nobody else would be interested in it, I was contacted by Producer/DJ Ido Amin, who had three 44Mb cartridges with some old art and music projects he wanted to recover. I extended my data recovery offer to him, and soon enough, a parcel arrived at my doorstep.

Step 1: Examination and Physical Recovery

The disks were well packaged and arrived safely. Inside, there was a letter –


Knowing from the past that bad cartridges can damage drives, I took some time to inspect the cartridges. The cases they arrived in were a little dusty and “sun-tanned” (yellowed plastic), but the disks inside were mostly clean.

Using my Linux-based recovery box, I used ddrescue to recover the data into raw disk image files for further analysis. One of the three cartridges read out without any faults; another had a 512-byte bad area that was recovered after about 100 retries.


Unfortunately, the last disk was not as lucky – it had a 512-byte bad area that could not be recovered even after >4 hours of attempts (recovery was restarted a number of times). The error seems to suggest that the drive was unable to locate the sector itself on the media, so a write failure or magnetic media failure is probably responsible for the flaw.
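For reference, ddrescue tracks its progress in a mapfile, and the amount of unrecovered data can be tallied from it directly. A minimal sketch (the mapfile contents below are illustrative, not from the actual recovery):

```python
# Tally unrecovered bytes from a GNU ddrescue mapfile.
# Mapfile format: comment lines start with '#'; the first data line is the
# status line; subsequent lines are "pos size status" block entries.
def bad_bytes(mapfile_text):
    data_lines = [ln for ln in mapfile_text.splitlines()
                  if ln.strip() and not ln.startswith("#")]
    total = 0
    for line in data_lines[1:]:  # skip the status line
        pos, size, status = line.split()
        if status in ("-", "*", "/"):  # bad sector / non-trimmed / non-scraped
            total += int(size, 16)
    return total

# Illustrative mapfile resembling a cartridge with one 512-byte bad area
sample = """\
# Mapfile. Created by GNU ddrescue
# current_pos  current_status  current_pass
0x00000000     +               1
#      pos        size  status
0x00000000  0x00010000  +
0x00010000  0x00000200  -
0x00010200  0x029F0000  +
"""
print(bad_bytes(sample))  # 512
```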

Step 2: Image Analysis

Unlike all the other SyQuest cartridges I’ve dealt with, all three of these were MS-DOS formatted. One of the cartridges contained plain files which could be extracted directly. This makes things easier for me in a way, except for one little problem: the remaining two cartridges were compressed with DoubleSpace.

For those who didn’t live through this era, DoubleSpace was a transparent full-drive (or partial-drive) compression utility included with MS-DOS 6. This was an era where CPUs were relatively under-powered, I/O was slow and space was limited. Including DoubleSpace allowed users such as myself (at that time) with a 40Mb Miniscribe MFM hard drive the chance to run Windows 3.1 alongside a full complement of MS-DOS programs, albeit slowly. The whole system worked by loading a module (DBLSPACE.BIN) into memory that mounted DBLSPACE.00? files for reading and writing. The actual compressed drive would appear as C (e.g. if your C drive was compressed), and the “real” C drive (i.e. the host drive containing the DBLSPACE.00? file) would be mounted at a higher drive letter such as F.

Unfortunately, such volumes can really only be accessed with the software they were written with, and no DoubleSpace-compatible utilities are widely available today. To compound the matter, DoubleSpace was very short-lived due to a lawsuit brought against Microsoft by the competing disk-compression-utility company Stac Electronics. This resulted in DoubleSpace being removed in MS-DOS 6.21 and replaced by DriveSpace in MS-DOS 6.22 (and onwards into Windows 95/98).

Step 3: Recovering the Data

As I don’t have a working MS-DOS installation on a computer complete with SCSI drivers for a SCSI card, getting access to a 44Mb cartridge image under MS-DOS posed a bit of a challenge.

To solve this, I decided to use my MS-DOS 6.22 VM in VMWare as my recovery platform, with MS-DOS 6.22 installed on a 520Mb primary drive. Importing the image “as if it were an attached SyQuest drive” requires converting a raw image to a .vmdk file. Unfortunately, I couldn’t find any straightforward way to do this.

Instead, I first created a pre-allocated hard drive of a larger size as a secondary drive. I selected 0.1Gb, as this is larger than the 44Mb cartridge image. Then, recognizing that a pre-allocated .vmdk file is just raw sectors, I overwrote the first 44Mb of the 100Mb .vmdk with the cartridge image.
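The overwrite itself is just a raw byte copy into the start of the flat extent, leaving the rest of the pre-allocated file untouched. A rough sketch (filenames are hypothetical, and the demonstration uses tiny stand-in files rather than the real 44Mb/100Mb sizes):

```python
# Overlay a raw cartridge image onto the start of a flat (pre-allocated)
# .vmdk, which is just raw sectors. Filenames here are hypothetical.
def overlay_image(image_path, vmdk_path, chunk=1024 * 1024):
    with open(image_path, "rb") as src, open(vmdk_path, "r+b") as dst:
        dst.seek(0)  # r+b modifies in place without truncating the tail
        while True:
            block = src.read(chunk)
            if not block:
                break
            dst.write(block)

# Tiny demonstration (real use: 44Mb image over a 100Mb flat .vmdk)
with open("cart.img", "wb") as f:
    f.write(b"SYQ" * 100)        # stand-in for the cartridge image
with open("disk-flat.vmdk", "wb") as f:
    f.write(b"\x00" * 1000)      # stand-in for the pre-allocated .vmdk
overlay_image("cart.img", "disk-flat.vmdk")
```

Note this only works with a pre-allocated (flat) virtual disk; a growable .vmdk has metadata interleaved with the data and cannot be overwritten this way.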

When the system was booted, the D: drive became accessible and the file listing matched the WinHex analysis above. So far so good. To try and mount the image, I tried using DriveSpace, as Wikipedia states:

MS-DOS 6.22 contained a reimplemented version of the disk compression software, but this time released under the name DriveSpace. The software was essentially identical to the MS-DOS 6.2 version of DoubleSpace from a user point of view, and was compatible with previous versions.

To get access to DRVSPACE, I had to set it up by compressing a volume. I created a blank, throw-away third .vmdk to compress, so that I could bypass the set-up wizard. This .vmdk was then deleted.


Despite what Wikipedia says, DriveSpace is not able to mount DoubleSpace volumes. This was unfortunate. Further investigation seems to suggest the following –


This implies that users who upgraded to MS-DOS 6.22 from 6.2 could continue to use DoubleSpace (probably by preserving the DBLSPACE.BIN and existing set-up). For those who wished to convert their DoubleSpace drives to DriveSpace, the process entails full decompression and recompression. Those who installed MS-DOS 6.22 from scratch have no DoubleSpace compatibility at all, which is unfortunate.

To solve this, I instead looked for the necessary file – the only option was an MS-DOS 6.2 boot disk with DBLSPACE.BIN. I managed to boot the VM from the boot disk successfully – the DoubleSpace volume auto-mounted as D: and the files became accessible.

The problem now was how to get the data out. Unfortunately, xcopy was not available on the boot disk, and executing xcopy from the C: MS-DOS 6.22 install results in “Incorrect MS-DOS Version”, so another tool was required. In this case, dosver was injected into the boot disk using WinImage, so that it could be executed to overcome the incorrect-version message. This allowed me to finally copy all the files onto the uncompressed C:, after which I could use WinImage or WinHex to access the .vmdk and extract the files. Job done!

It may actually be possible to copy DBLSPACE.BIN from the boot disk to the root directory of the MS-DOS 6.22 installation and reboot from that – with any luck, the OS might pick up on it and load it upon a restart, but I didn’t try this.

Step 4: Returning the Data

Interestingly, within one of the DoubleSpace volumes, there were a few ZIP files. This was on the cartridge with 512 bytes unrecovered. However, the recovered ZIP files decompressed without any errors, and further analysis showed that the DBLSPACE.000 file just so happened not to use the bad sector, so no data was actually lost. What a lucky coincidence.

As usual, the files were zipped up and returned via a Dropbox share. However, many of the files would have been created with old software which is not able to run on modern computers. That being said, it is still relatively easy to emulate an old MS-DOS or Windows 9x machine in a virtual machine, so the data isn’t wasted in the end.


Another set of cartridges arrived for the voluntary data recovery service. The cartridges did put up a fight, but the physical recovery was mostly successful. The biggest problem was the use of DoubleSpace full-disk compression, which became rare after a lawsuit forced its removal. If it were not for the fact that I somewhat understood DoubleSpace and DriveSpace from having used them myself as a child, recovery of the data would not have been straightforward – especially if it had not been possible to source a boot disk or install disk set for MS-DOS 6.2. With a bit of “massaging” of image files, it was possible to perform the recovery using a virtual machine, which is more fun than trying to cobble together flaky old hardware.

Comment from DJ/Producer Ido Amin

Remixes are very common these days, but producing them in the late 80’s and early 90’s was quite a different story. Sampling on a home computer was a very novel idea, with challenges in hardware, software, speed, memory and disk space. I milked the best in this regard from 3 different home computers (an Apple ][ with a Greengate sampling card, an Amiga and an XT). I already had one remix hit, and was thrilled when Greg Hendershott’s Cakewalk 3 (1991) appeared, and I was able to create some more remixes on the relatively vast storage of some second-hand 44mb Syquest cartridges.

Twenty-five years later, computers have improved, sample rates and bitrates have quadrupled, and remixes are culturally acceptable (hallelujah!). As a form of digital crate-digging, I wanted to revive the early sample projects and re-do them.

But 25 years later, Syquest drives are no longer around. Searching the internet, I found Gough Lui from down under – pretty much as long as long distance gets! – and he kindly helped me resurrect the material, even visiting the netherworld of defunct OS systems with great skill to salvage what was possible.

Among the data salvaged, I’m thrilled about one specific audio project – a remix for one of Aris San’s tracks, which had a title identical to the name of my girlfriend at the time. For an experimental artist, one of the gifts of the digital domain is the ability to go back and re-mix your project, having the luxury of hindsight. The old ’91 remix is on my SoundCloud – email me for a private listen link.

Aris San was a fusion artist – playing energetic and soulful Greek music on an electric guitar, rock’n’roll style, instead of on a Bouzouki. That’s the spirit of all music – evolving, adding, creating fusion and integration. The passion for remixing is really the classic, traditional approach!


Analysis: Freeview Sydney Post-Launch of ABC HD (6-7th Dec 2016)

Yesterday marked the launch of ABC HD, which means that Sydney Freeview finally gets each broadcaster’s main station in high definition. According to the launch material, the broadcast would be made in MPEG-4, and naturally, people asked me to analyze the service.

Around this time last year was when I last did a comprehensive check of Freeview transmissions, and since then, much has happened which has been captured (at least partially) through interim postings such as these. It gets a little messy to comb through them all, so instead of issuing an interim Freeview Update, I’ve decided to redo the analysis entirely to take another yearly snapshot.

As a result, I am awake at 4am … writing this post. No kidding.


As most, if not all, stations have moved to statistical multiplexing, spot bitrates can vary wildly. As a result, I take a transport stream recording of the whole multiplex for a period of 3 hours (+/- 1 minute) to average out the bitrates. The recorded TS files are checked for time, and the bitrates are determined by dividing the size of the demultiplexed PID streams by the record time. Due to slight inaccuracies in determining record times, total multiplex bitrate is likely to vary by a few kbit/s, but each PID stream bitrate should still be accurate to the kbit/s level. The streams were analyzed using TSReader.
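The calculation itself is straightforward. A sketch with illustrative numbers (not figures from the actual survey):

```python
# Per-PID average bitrate from demultiplexed stream size and record time.
# The capture size below is an illustrative example, not survey data.
def bitrate_kbps(stream_bytes, record_seconds):
    return stream_bytes * 8 / record_seconds / 1000

three_hours = 3 * 60 * 60  # 10800 s capture window
print(bitrate_kbps(4_050_000_000, three_hours))  # 3000.0 kbit/s
```

This also shows why record-time error matters more for the mux total than for a single PID: the same fractional timing error scales with the stream's absolute bitrate.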



The service summary table is shown above. As I haven’t kept my eye on Freeview for a while, a few changes have happened:

  • RACING.COM channel’s audio bitrate has gone down from 64kbit/s to 48kbit/s.
  • SBS has stopped using 704x576i and 1440x1080i formats and has reverted to the full-size frame formats. Joint-stereo audio has reverted to full stereo coding, and SBS TWO has been renamed to SBS VICELAND. A private stream has been allocated to Food Network where formerly it was sharing with another channel.
  • TEN HD has moved up to [email protected] profile from previously using [email protected], which should bring more compression efficiency.
  • ABC launched ABC HD using MPEG-4 H.264 [email protected] with a frame size of 1920x1080i, along with Dolby AC3 2-ch audio at 384kbit/s. The News24 service reverts to standard definition with MPEG-2 video and MPEG-1 audio.
  • These changes mean that all HD services are now 1920x1080i at 25fps. No more 50fps services are on the air. Furthermore, all channels except SBS HD have standardized on [email protected] profile.

In regards to channel video bitrate, here is the ranking, sorted by bitrate from high to low:


The three shopping channels (in purple) that are still on the air consume around 1.7-2.3Mbit/s of video bitrate each. Unfortunately, when it comes to services you’d actually like to watch, the bitrates are starting to look less healthy.

SBS HD (in red) is the outright leader on absolute bitrate, but this is because it is still using the older MPEG-2 encoding. This allows older HD sets to still receive HD, but the price is about half the bitrate efficiency of MPEG-4. Even then, it should compete fairly well with the majority of MPEG-4 HD stations, given that its bitrate is almost double theirs.

MPEG-4 services (in green) vary somewhat with respect to bitrate. 7HD and TEN HD both have more generous bitrate allocations, especially compared to 9HD. ABC HD has barely the same bitrate allocation as many standard definition channels. This may change in the future; at this point, the launch of ABC HD is relatively underwhelming, with all the content I’ve seen being merely upscaled SD content.

Looking at individual broadcasters, it seems that Channel 7’s push towards HD leaves its standard definition version relatively strangled with just 2.8Mbit/s of bitrate, and 7TWO isn’t faring that well either with 2.5Mbit/s, which is only the same as RACING.COM – also carried by Channel 7, but in MPEG-4. The 7flix service fares slightly better with 2.9Mbit/s. The only Channel 7 SD service with a decent bitrate is 7mate, raking in 3.7Mbit/s.

In the case of Ten, the services show healthier bitrates overall. Channel 9 seems to continue to prioritise and balance the SD service against the HD service a lot better, so that SD viewers aren’t seeing a poor picture, although their secondary services have less bitrate.

In fact, it seems that SBS is the one really feeling the pinch: because all of their services are MPEG-2, their channels sit towards the lower end of the bitrate spectrum. Food Network is their highest-bitrate channel outside of their HD service, and VICELAND is even below RACING.COM and only just slightly ahead of the best shopping channel. This is somewhat disappointing to see.

ABC seems to continue broadcasting a good primary SD service, and their ABC2/KIDS is the highest bitrate SD service at the time of survey. Their other services (ME and News 24) both rank fairly well. It seems their HD service has a more “SD-like” bitrate for this balance to occur.

Overall, it seems that standard definition viewers are feeling the bitrate pinch. Where previously the primary standard definition channels may have carried 4-5Mbit/s of bitrate, the increasing pressure of new channels, the need for MPEG-2 simulcasting and the carriage of “non-essential” channels has resulted in the average SD service video bitrate getting closer to 3Mbit/s.


A closer look at the ABC multiplex bitrate evolution shows that for the most part, the ABC2/KIDS service has seen a slow increase in rates, the ABC3/ABC ME service a slow decrease in rates, and the main SD channel has been holding fairly stable although trending towards a slight reduction. News 24 took a major cut and went back to SD to allow for the launch of ABC HD.


Looking at the per-PID rates, it seems that ABC HD may indeed pick up more bitrate – in the order of up to 1Mbit/s – as the multiplex has a lot of nulls compared with most other broadcasters. On the whole, mux utilization for all broadcasters was aggressive, with the exception of ABC (1.3Mbit/s free) and Seven (0.6Mbit/s free, twice as much as the others). Due to an improvement in determining the recording time, the reported mux bitrates are now much closer – slight variations are likely due to packet loss from transient reception errors.


The launch of new HD services is a double-edged sword. On the one hand, for those with more modern sets or computer tuners, we are able to enjoy full HD at long last, using the more modern H.264 AVC MPEG-4 codec. However, the need to keep compatibility with legacy receivers capable of only MPEG-2 video and MPEG-1 audio really means that we are still losing some bitrate there. In some cases where the former HD station was not a simulcast, this also meant “splitting” the bitrate from a former MPEG-2 HD service into an MPEG-2 SD service and an MPEG-4 HD service, resulting in more bitrate constraints on both services.

As a result of the aggressive move by all broadcasters (except SBS) to embrace the MPEG-4 encoding, the HD services are able to look acceptable despite using only marginally more bitrate than a regular SD service. The bias towards the HD service is apparent for 7HD and TEN HD, whereas Nine and ABC are both prioritising their SD services instead.

The failure of SBS to transition to MPEG-4 at this time has resulted in their other SD MPEG-2 services being bitrate-starved compared to the competition. Overall, bitrates for standard definition services predominantly lie in the 3Mbit/s region, where several years back they would have been closer to 4-5Mbit/s. This is not of great relevance where users are watching the HD primary service; however, it will affect all users of secondary services carried by the broadcaster. The number of shopping channels is fewer than in prior analyses, but their existence only exacerbates the bitrate-limited nature of DVB-T broadcast.

The launch of ABC HD itself is a good thing, but at this stage as many viewers have noted, is rather underwhelming due to the broadcast of only upscaled SD content at most times. The service does have about 1Mbit/s of wriggle room, so it could turn out to be quite competitive with the commercial broadcasters’ HD channels, but at the time it was surveyed, this was not the case.

Let’s hope SBS transitions to MPEG-4 on its HD channel to relieve the bitrate starvation on its SD services – after all, it is already simulcasting HD and SD in MPEG-2, so there are only gains to be had.

This is a bleary-eyed Gough, signing off for the day …

Update: Service Bitrate Evolution over Time

At the request of one of my regular readers who wanted to see bitrate evolution graphs for the other multiplexes, I went on a rampage to gather all of my data so it could be plotted. Of note is the fact that I don’t do “global” surveys of bitrates very often and, depending on the situation (e.g. an impending channel launch), will only survey one mux at a time. As a result, the number of samples for each multiplex varies, as does their spacing in time.

Channel 7

Due to the number of changes in Channel 7’s line-up, I have a lot of data samples for them. The graph shows just how rocky the history of bitrates has been on Channel 7’s mux. The main station has swung wildly, but the overall trend is downward as of late, as it is with 7TWO. At one point, their bitrates were lower than necessary as the null packet fraction was quite high due to a potential misconfiguration after turning off a channel. TV4ME has left the air, and RACING.COM has increased its bitrate marginally. It seems 7flix has also gained in bitrate, as has 7mate, whereas 7HD has lost out over time. The big drop in 7mate corresponds to the launch of 7HD. The days of SD services having 4-5Mbit/s seem to be well and truly behind us.


SBS

I don’t have much data for SBS, and that is primarily because not much has happened, bar the launch of the Food Network channel. SBS TWO used to command a fair chunk of bitrate at the expense of SBS HD, which did some mode changes from time to time, but has otherwise been restored to its 2013-level of bitrate. The increase in available bits to accommodate Food Network was primarily due to a change to their modulation mode at the time of launch, bringing their available bitrate into line with their competing channels. All stations now run a 1/16 guard interval with a 3/4 code rate.

Channel 9

Channel Nine’s strategy seems to be relatively conservative. EXTRA 2 got the axe, freeing up 2Mbit/s. Along with the reversion of GEM to SD, this gave the necessary bitrate to support 9Life and 9HD. All services have been very consistent in bitrate allocation at the sample points.

Channel Ten

Ten’s bitrate strategy seems to be a little hill-and-dale. The shopping channel Spree has only drifted downwards marginally, whereas TVSN has drifted upwards and levelled out. TEN HD’s bitrate has been relatively stable after the 1Mbit/s “teaser” stage. ONE, however, has seen bitrate cuts throughout its high definition life to feed the bitrate increases to TVSN and ELEVEN. At the launch of the HD service, though, the boost to ELEVEN was cut back noticeably, and the main standard definition channel received its first bitrate cut. This leaves all the non-shopping standard definition channels sitting at roughly the same bitrate of 3-ish Mbit/s.


ABC

I covered it earlier, but since I found one more data point, I’ve decided to produce an updated graph. It seems that bitrate movement at the ABC is gradual. The main SD channel has only received a marginal cut to bitrates, with News24 making a big drop as it reverted from HD to SD. ABC2/KIDS has undergone a fairly consistent ramp-up in bitrates, whereas ABC3/ABC ME has trended downwards over time.


I decided to include this particular graph of the now-defunct TVS just for a laugh. Not being subject to the pressures of bitrate budgeting between competing services, its bitrate was rock-solid right up until it went off the air.


Canb2016: Part 2 – ANU & the Asia-Pacific Solar Research Conference

The main reason for heading all the way down to Canberra was to attend the APSRC. Having never spent any time at ANU, I felt it was pertinent that I take a tour around some of the campus to see what the uni is like.

The Campus


The Australian National University, ANU for short, is in Canberra. The main gate (I believe) is at the intersection of North Road and Barry Drive, where I found the most official looking logo sign across the whole campus. The campus itself is quite dispersed, covering a fair amount of area with many separate buildings.


Some of the buildings appear somewhat dated, but elegant in their own way. This is their Engineering building.


Whereas other buildings on the campus have been renewed and look a lot more modern. This is the Research School of Chemistry which has a building somewhat reminiscent of UNSW’s Law Building.


Some of the buildings I found were just downright quirky. If you thought Keith Burrows Theatre at UNSW was round, you haven’t seen the Haydon Allen lecture theatre, also dubbed “The Tank”.


The School of Art has a very vintage look to it, with a sculpture in front of the building. Adjacent to this is the School of Music which was undergoing renovations.


The university also has a Drill Hall Gallery, with a sculpture that looks a bit like the pages of a book.


Just opposite is the ANU Shop where campus tours depart, but I didn’t bother with that. In fact, there are many ways to get into the campus by foot, and sometimes it just seems that you’re walking from the street into the campus without knowing it.


There are even function centres that can be hired …


… as well as a nearby UniLodge accommodation with some restaurants as well. Being right next to Canberra Centre, everything seems rather convenient.


Just around the corner, there was this set of accommodation blocks. I mistook the green mesh for temporary scaffolding. A very interesting choice of architecture.


The set of blocks must have been fairly recent as well, since they had a new-looking Schneider Electric KPX padmount transformer. In case you wondered how much it weighs – a hair over 5 tonnes is the answer!

Another interesting thing was the presence of “Cab Spot” signs in some areas, so that people can convey their location with a high amount of accuracy without any ambiguity.


The uni itself is located not far from Black Mountain – the location of Telstra Tower. This tower is quite iconic and houses a number of microwave repeater dishes and transmitting antennas. Rather amusingly, the tree in the foreground is cordoned off with a warning sign that states “Caution! Bees in Lavender”. I’ve never seen that before.

As usual, there are also bulletin boards where university students hang posters and signs. More about that later.


While walking around, every-so-often, you can catch glimpses of the tower which remind you that you’re in Canberra.


If there’s one thing that really stands out, that is how ANU has very much maintained a green campus. Nature is everywhere – this nice boulevard …


… and even a creek running through the university.


It feels like a relaxed place to be, although the students sitting their exams might beg to differ.


The birds also agree – I managed to spot crows, magpies, rosellas and wattlebirds in my short time roaming around. The magpie was also very nice in sitting still and singing for me for a few minutes.


Their environmental responsibility also extends to this row of solar panels on top of their retail precinct next to their library building.


When it comes to facilities, there are really no complaints either. We don’t even have electric barbecues on our campus!


The university acts as a “safe space” for expressing individual thoughts and grievances, and the noticeboards really do show that this is very much alive. Even without the benefit of noticeboards, it seems students aren’t shy of putting stickers on water fountains and signs to get their message across – in this case, a parody sticker of Wilson Security’s logo protesting their Nauru and Manus Island operations.


Canberra seems to have a reputation for being under camera surveillance, and at ANU, this is no different. In many locations, they have disguised CCTV cameras as light posts – mildly effective during the day, but more obvious at night. I suppose it might be a little more aesthetically pleasing than having them obvious.


I saw some other ANU-specific signs – good to see they use recycled water, and a good number of people cycle around the campus. The ramp load message was rather interesting – it specifies a weight load per area. This might be because they don’t want the pavers destroyed, but it is more an engineer’s sign than something your average person can work out.

But the sign itself is flawed. How do you determine it? I mean … if I’m an average 70kg human standing on a single foot, a back-of-the-envelope calculation assuming my shoe measures about 30cm x 15cm tells me I’m exerting a loading of about 1556kg/m^2. Does that mean I can’t walk on this ramp? Uhh … I suspect they need to add even more information for us to resolve this uncertainty.
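For the curious, the back-of-the-envelope figure works out like this:

```python
# Mass over contact area: an average human standing on one foot.
mass_kg = 70.0
shoe_area_m2 = 0.30 * 0.15           # ~30cm x 15cm footprint
loading_kg_per_m2 = mass_kg / shoe_area_m2
print(round(loading_kg_per_m2))      # 1556
```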


On one day, I spotted the mobile blood-bank collection truck on campus, which isn’t something I see very often …


… and this road safety sign on Barry Drive. It’s fairly old fashioned, but I just like the way it almost dares you to drink and drive.

The Conference


The conference was held in the Manning Clark Lecture Theatres.


The theatres were named after Manning Clark, Professor of History.




The conference itself was an information overload: session after session of overviews and mostly solid technical information, new ideas and hope for the future. It was both inspiring and draining, as it was three days of 8:30am-to-5:30pm presentations and conversations with like-minded people. It was highly enjoyable, and I think I got a lot out of it.


It was a rather interesting experience, even on the first day, when Josh Frydenberg was invited to speak at the co-run Energy Update conference. Just outside the theatres, signs of a protest were already evident, and constant heckling from crowd members including “Knitting Nanna” really put the heat on the minister. I really don’t envy his position, as it’s a tough one to balance as you can’t please everybody.

Regardless, from everyone that I heard speak at the conference, I think the key takeaways from the conference included:

  • The need to do more, and quickly. To arrest global warming will require all the resources we have. This includes renewable energy resources, but also less conventional carbon sequestration approaches. It seems too late to save the planet on energy policy alone.
  • Policy drivers can be extremely powerful in making things work – at least, from the analyst’s point of view.
  • Technology improvements in wind power have resulted in steady improvements in capacity factors and costs for energy – it is very much complementary to solar output and offshore wind appears to be a key growth area.
  • Solar energy penetration has altered the load curve to resemble a “duck” curve, where the afternoon-evening peak is not really being aided and overgeneration during the middle of the day is a risk.
  • Lack of inertia in present line-interactive inverters leads to challenges in maintaining grid stability, although solutions are actively being worked upon, which may mean that inverters work together in teams to provide FCAS (frequency control ancillary services) to the grid (and, consequently, rely on networking/IoT). Whether they can provide utility-level reliability is yet to be demonstrated.
  • New opportunities in distributed-generation peer-to-peer energy trading have the potential to change the business model of distributors and retailers, supplying energy at similar costs to end users while allowing end users to effectively trade energy on an open market – whether they are receptive to change is questionable.
  • Solar cell device work in perovskites is progressing at a remarkable rate. Unusual behaviours are beginning to find explanations rooted in classical physics and chemistry, and ever-greater efficiencies and cell sizes have been reported.
  • Battery storage is a big thing for the grid. Costs are likely to make it feasible for some applications very much in the “current” time, although how best to integrate it into the grid for other purposes (PFC, reactive power support) is still under investigation. It has the potential to mitigate the intermittency which can cause grid problems, although how to share the costs of such services between the grid provider and the consumer is also a point of contention.

One of the things I saw throughout was a constant focus on capacity factors. It seems that the vast majority of projects are basically “profit driven”, so there is this issue of making sure everything is used to its maximal ability to maximise profits. Unfortunately, this way of thinking constrains us dearly, as it means there is an equity issue to be addressed, especially where transmission network capacities are maxed out and certain end-points can’t trade energy due to the prevailing network conditions.

It also results in intermittency being an issue. When running anything near its maximal capacity, there is little reserve to handle transients or changes in conditions. This is partly why the stability of the grid may be impacted if everything is “optimized” to be “just sufficient”. This is also why I find some of the proposals somewhat scary. Networked small-scale inverters acting as large generators have a slight benefit in that failure of any one will only result in small deviations; however, such systems open up new dependencies on networking and the “internet of things” which may become vulnerable to even basic denial-of-service attacks.

AEMO's dispatch market currently operates on 5-minute intervals, scheduling and dispatching generation to meet demand on that timescale. Adding many smaller generators and co-ordinating them is likely to prove onerous, and existing large-scale power plants were probably built with contractual agreements to "buy" their energy at a certain rate for a certain number of years, ensuring their capacity factor. This is essentially energy that has to be bought … so coal it is?

Meanwhile, solar energy is getting quite cheap. It's so cheap that I think we can afford not to utilize all of it. In fact, overprovisioning cheap, less-reliable panels and dispersing them geographically probably makes sense. By deliberately not harvesting all the energy available at any given moment, you hold a power reserve that can "help" the grid along with extra output – compensating, for example, for weather events causing ramping at other solar farms. It also makes it possible to utilize transmission line capacity to the maximum throughout the day and night, especially when teamed with storage.

The issue is that we need to do more, and fast, for our own future. It's important to remember that the price of something doesn't really reflect its true environmental cost. Coal and nuclear may seem cheap on paper, but they carry other costs which aren't obvious. Renewable energy makes comparison harder, since its costs are almost all upfront, but its impact and running costs are generally low once installed. I really hope that storage and renewables both get cheap enough to work hand-in-hand to provide a stable-enough energy source that can transition us away from gas and coal fired plants within the next decade. Maybe it's not costs and profits we should be thinking about, but the simple fact that if we don't do this together, we might not be living in the sort of world we have now – and nor will our children.

Wi-Fi Devices on the Network?

Being at a university, good internet connectivity was a given. On the first day, as the conference-provided credentials didn't work, I had to tunnel through the eduroam system. This arrangement allows roaming researchers to log in to the Wi-Fi network at participating institutions using their "home" institution credentials; it works by forwarding the session all the way back to the home institution, and thus is not as efficient as a local login. Still, it got me online well enough to reach my home servers, check everything was okay and transfer some data to my phone and laptop.

After the end of the first day, the network connectivity was fixed, so I decided to give the network a scan to see what turned up. Using the Fing application, I catalogued the OUIs derived from the MAC addresses to see how popular certain brands of devices were.


A total of 693 devices were found connected to the ANU_Secure network. The majority were Apple devices, with Samsung and Intel taking the next two places; together these three accounted for more than 75% of connected devices. This covers all types of devices – phones, laptops, etc.
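The tally works because the first three octets of a MAC address form the OUI, which identifies the manufacturer. A minimal sketch of the idea – the MAC addresses and vendor mappings below are illustrative only; a real lookup would use the IEEE OUI registry that tools like Fing bundle:

```python
from collections import Counter

# Hypothetical MAC addresses, as a scanner export might provide them
macs = [
    "3C:15:C2:01:02:03",
    "3C:15:C2:AA:BB:CC",
    "8C:77:12:01:02:03",
    "00:24:D7:01:02:03",
]

# Illustrative OUI-to-vendor map; real entries come from the IEEE registry
oui_vendors = {
    "3C:15:C2": "Apple",
    "8C:77:12": "Samsung",
    "00:24:D7": "Intel",
}

def vendor_of(mac):
    """Return the vendor for a MAC address via its OUI (first three octets)."""
    return oui_vendors.get(mac.upper()[:8], "Unknown")

# Tally devices per vendor, most common first
counts = Counter(vendor_of(m) for m in macs)
for vendor, n in counts.most_common():
    print(vendor, n)
```

Fing does this lookup automatically per device; the point here is only that the brand statistics fall out of nothing more than the address prefixes.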

The ANU_Secure network allocated addresses behind a symmetric 1:1 NAT and firewall. The wireless access points were Aruba-branded, whereas the ones at UNSW are Cisco.


In all, the ANU campus is fairly large, and even though some of the buildings were rather dated, the university seems to be undergoing a renewal of sorts. The campus is green, relaxing and feature-rich, well situated and easy to get to.

The conference itself was eye-opening and inspiring, even though it was very draining due to the information overload. I think the key takeaway is that there are challenges to integrating high levels of renewables and improving their performance, but these challenges are being actively researched, and strategies to improve the coexistence of old and new are being developed. The inertia of old industries and business-as-usual conservatism really does put a brake on the pace of change, but massive change seems inevitable if we are to meet and exceed accord targets and keep warming and climate change at bay.

The next post looks at some rather interesting infrastructure, namely the ACT Government and iiNet’s CBRfree WiFi network.


A photo of the afternoon sun hitting The Department of Agriculture and Water Resources building facing the Ellery Crescent entrance to ANU as I was leaving the conference for the final time on Thursday 1st December.

Posted in Opinion, Telecommunications, Travel