So, are you underwhelmed or what? Of course, I’m talking about the newly redesigned Digital Digest homepage (and a couple of other section home pages, as well as the top navigation bar), that “little project” I first hinted at a couple of weeks ago (I told you it was “little”). The old homepage had been designed for yesterday’s lower resolution monitors, and so it was a bit too narrow and a bit too long. It’s now been simplified to highlight the most important stuff, namely news and software updates. There’s also a new News section to go along with these changes (as well as changes to the actual news pages). The top navigation bar, and even our site logo, have also been renovated, with the top bar taking up a little less vertical space and showing fewer links by highlighting only the most important sections. A few pages remain unchanged, as I ran out of time with my self-imposed deadline of yesterday, so I’ll slowly rework these pages to bring them into line.
Despite a busy week in which I also had to exchange my Samsung plasma TV for a new one due to an unrepairable fault (so I’m getting a nice new 2012-model 60″ on Monday as a no-cost replacement, which is an alright outcome), I also managed to write the March 2012 US video game sales analysis, which you can read here. Nothing too surprising, with the Xbox 360 still leading, although the PS3 is catching up a bit. Mass Effect 3 dominated the month and sold 4 times as many copies on the 360 as on the PS3 (in the US, at least).
Onto the news roundup then …
Starting with copyright news, Google guy Sergey Brin warned this week that web freedoms are in peril, thanks to increasing attacks from interested parties.
Those interested parties of course include the entertainment industry, who are increasingly painting themselves into a corner as the enemy of the Internet (which I personally don’t think is a very good strategy), governments who do the bidding of the entertainment industry, and, perhaps controversially, the likes of Apple and Facebook, according to Brin. The inclusion of the two tech giants, both Google competitors, may seem a bit cynical, but Brin’s main point is that the closed, proprietary nature of Apple/Facebook (Amazon probably needs to be added to the list too) means that they have full control of what can and cannot be done on their platforms, which goes against the principles of the open web. But while Google embraces open source and should be commended for it, anyone who makes websites will know that Google themselves are not exactly transparent when it comes to a lot of issues, and their practical monopoly on the search market gives them the same sort of power that Apple/Facebook derive from their more proprietary platforms.
With Apple under fire from the DoJ for eBook price fixing, and Apple firing back by calling Amazon a monopoly, a lot of what Brin is saying does make sense. The role that DRM plays in all of this is actually quite interesting. Even though it was originally designed to prevent piracy, DRM these days is far more effective at solidifying monopolies and preventing competition. By locking proprietary formats to hardware platforms, and tying DRM to these proprietary formats, it all sounds a bit more sinister than simple copyright enforcement. Most publishers are stupid and paranoid enough to actually want the DRM, but this insatiable appetite for unreliable technology is also driving the small players out of the market, as one such small player revealed this week. The cost of DRM, the actual financial cost, is quite large for a new start-up – often in the tens of thousands, not even counting the technical knowledge requirements. This means that as long as publishers are still keen on DRM (to give them that false sense of security they crave), it benefits the big guys at the expense of the smaller players, and the monopolistic situation this creates probably hurts the publishers more in the end than if they had not used DRM at all (DRM mostly only prevents casual piracy, the type where people share the same eBook with friends, as opposed to straight-up piracy, where pirate groups can easily circumvent DRM and upload the content online for all to enjoy).
Good news, which may soon turn to bad, in Australia – our second largest ISP here actually managed to win a copyright case against the Hollywood-backed AFACT. So for now at least, ISPs have been found to be largely not responsible for the actions of their subscribers. The High Court also found that ISPs do not need to deal with infringement notices that are not accompanied by a court order – going totally against the precedents being set in other countries, where ISPs have been made the scapegoats in the war against piracy. And as this was a High Court decision, the highest court in the land, the win is final, and no more appeals can be granted. An obviously embarrassed AFACT, who have long been accused of taking orders directly from the MPAA (with Wikileaks documents showing that’s exactly what happened with this legal case), will now deploy a new tactic: they have blamed the existing copyright laws for not being biased enough towards rights holders, and want them changed so that, in the future, they could easily win lawsuits such as this one. This is where the possible bad news may come from, should the government bail out AFACT by implementing new laws. So it’s just like that old saying: if you can’t beat them, have the rules changed so you can!
I’ve always felt that making ISPs liable for the activities of their users, especially when most ISPs don’t even have the capability to monitor their users’ downloads, was suspect. I’m not quite sure the phone or electricity company analogy fully applies to ISPs, but I don’t think it’s that far off. These companies, like ISPs, provide a service to users, and users are liable for how they use the service. The only difference is that ISPs are made to be different under the DMCA (there’s no DMCA or equivalent for the phone company, for example), but they really shouldn’t be. The content holders will argue that it’s much easier for ISPs to spy on their subscribers and to stop their illegal activities. But just because it’s easy, does that really mean it should happen? By all means, the ISP should take action if there’s a corresponding court order, but for “infringement notices”, which are merely untested allegations, why should the ISP be liable for something that hasn’t even been established to be illegal yet?
There’s probably a better term for it, but for me, the issue of web piracy has suffered from a lot of “legal slippage”. What I mean is that, because the problem has been so widespread, and the impact of the problem so exaggerated by the usual suspects, there’s this acceptance that corners need to be cut in order to “streamline” the legal process. So due process is out, and even basic distinctions like “evidence” and “proof” have been blurred to the point where “allegation” has become “guilt”. The lobbyists have pushed for this outcome, the government has been supportive, and the tech companies have been scared into accepting it all. Which is why it was disappointing, but not too surprising, to read RapidShare’s manifesto on “Responsible Practices for Cloud Storage Providers”, a defeatist piece that signals the surrender of the cloud storage industry to the power of the entertainment lobby. According to the manifesto, of the various aspects of a DMCA takedown request, including its validity, the only factor that actually matters is the formatting. If it’s properly formatted, then RapidShare says the takedown request should be deemed valid, even if it’s for something ridiculous like removing open source software. So, in RapidShare’s eyes, it’s perfectly reasonable for me to get a competitor’s RapidShare account closed down as long as I submit a *properly formatted* DMCA request, and if my request turns out to be invalid, then it’s up to my competitor to prove as much (in RapidShare’s own words, it’s up to the user to explain “why the suspicions are unfounded”), and to take legal action against me for filing a false request (if I was stupid enough to use my real name in the first place). This must give the business users of RapidShare real confidence in the reliability of the service.
Worse yet, those same business users who use RapidShare to privately store and share commercially confidential information should be even more worried about RapidShare’s stated policy of reserving the right to “inspect” the files of users who have failed to prove their innocence.
What, no strip searches?
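To see just how low that “properly formatted” bar actually is, here’s a minimal sketch of what a format-only check amounts to. This is entirely hypothetical code of my own (RapidShare hasn’t published anything of the sort, and the field names are invented), with the required elements loosely following what a DMCA notice must contain – note that nothing here ever asks whether the claim has any merit.

```python
# Hypothetical "format-only" takedown check, in the spirit of RapidShare's
# manifesto. Field names are my own invention; the required elements loosely
# follow the statutory DMCA notice requirements (17 U.S.C. 512(c)(3)).

REQUIRED_FIELDS = [
    "copyrighted_work",      # identification of the work claimed to be infringed
    "infringing_material",   # location of the allegedly infringing file
    "contact_information",   # complainant's name, address, email
    "good_faith_statement",  # belief that the use is not authorised
    "accuracy_statement",    # made under penalty of perjury
    "signature",             # physical or electronic signature
]

def takedown_is_valid(notice: dict) -> bool:
    """Deem a takedown 'valid' if every required field is simply non-empty.

    Note what's missing: no check that the complainant owns anything, and
    no check that the file actually infringes. Allegation becomes guilt.
    """
    return all(notice.get(field) for field in REQUIRED_FIELDS)

# A perfectly "valid" notice aimed at, say, a piece of open source software:
bogus_notice = {field: "filled in" for field in REQUIRED_FIELDS}
print(takedown_is_valid(bogus_notice))  # True -- and down the file goes
```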
Nothing much happening HD-wise, although this one story about Sony’s upcoming archival storage format was interesting, mostly due to the reaction to it.
Sony has announced a new archival storage format, based on the Blu-ray format, that aims to offer 1.5TB of storage. It does it in a pretty old-fashioned way, by putting as many as 12 discs into the same cartridge unit (so 12 times 125GB BDXL equals 1.5TB). By the time I saw the story, it had already gathered a lot of attention, which I thought was weird for a product aimed mainly at broadcasters and corporations. A lot of the comments were the usual “why pay $$$ for this when you can get a 1.5TB HDD for $79” and the like, and as much as I like to bash Sony, and I really do, I felt compelled to write the story just so I could clarify a few things here.
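For those checking the maths on that headline figure, there’s nothing hidden – just disc count times per-disc capacity as reported:

```python
# Back-of-the-envelope maths for Sony's cartridge: 12 BDXL discs per unit.
DISCS_PER_CARTRIDGE = 12
GB_PER_DISC = 125  # per-disc capacity, as reported

total_gb = DISCS_PER_CARTRIDGE * GB_PER_DISC
print(f"{total_gb} GB per cartridge, or {total_gb / 1000} TB")  # 1500 GB, 1.5 TB
```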
A 1.5TB HDD will definitely be cheaper than Sony’s proprietary drive and disc cartridge system, but they’re for entirely different purposes. For archival storage, data retention is everything, and an active system like a hard drive, with its mechanical bits and bobs, is not ideal. So optical discs do have a few advantages here, and you can’t really blame Sony for using Blu-ray as the basis of this new format. Putting 12 discs in a cartridge may seem like a “dumb” solution, especially since the cartridge appears to simply act as a carousel system and doesn’t allow for parallel writes and reads (so only one disc is extracted from the cartridge and written/read at a time), but for archival purposes, where you’re only likely to write to disc once (and, if things go well, never actually read the damn thing), it gets the job done. Using an active hard drive for archival storage is suicidal, unless it’s part of a well-maintained array of drives, which doesn’t seem to make much sense from an economics point of view.
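Just to put some rough numbers on the “one disc at a time” limitation – and to be clear, the write speed below is my own assumption, since Sony hasn’t published drive specs:

```python
# Rough feel for sequential (carousel) writes: one disc at a time.
# The 2x Blu-ray write speed is an assumption, not a published Sony spec.
MB_PER_DISC = 125_000      # 125 GB per BDXL disc
WRITE_SPEED_MB_S = 9.0     # ~2x Blu-ray (1x = 4.5 MB/s), assumed
DISCS = 12

hours_per_disc = MB_PER_DISC / WRITE_SPEED_MB_S / 3600
total_hours = DISCS * hours_per_disc
print(f"{hours_per_disc:.1f} hours per disc, ~{total_hours:.0f} hours per cartridge")
# -> roughly 3.9 hours per disc, ~46 hours to fill all 12 sequentially
```

Nearly two days to fill a cartridge sounds awful, until you remember that archival data is written once and then, ideally, left alone for a decade.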
I also did a bit of research to see if SSDs are more suited to this kind of task, considering mechanical drives are on the way out – but SSD data retention may actually be worse if you don’t get the right type of drive, as the electrons used to “store” the data may leak to the point where the data simply disappears (this info comes via a web forum, so it may in fact be made up). So it seems optical discs, at least for archival purposes, do have a role to play, although whether they’re better than current tape-based systems, I don’t know and really don’t care to know, since I’ve already spent way too much time researching and writing about something that’s not even remotely interesting to most people.
But for those that are interested in these kinds of things, here’s a forum thread that may help you waste a few hours of your life.
That’s probably as good a place as any to stop writing, so I can stop wasting your time, but mainly because I’ve run out of things to write about. See you next week.