Wednesday, October 22, 2008
I've broken down and started using Twitter, largely to eliminate the habit of finding a link on the internet and using work email to send it to my private email. Also so I can follow smart people who rarely actually make blog updates. My account name is Bwreilly. At some point I will probably collect the links I wander into and repost them here, as it is an embarrassingly easy way to make a post, and also because the links themselves are often of interest and it would be a shame to deny them to people who disdain Twitter.
Tuesday, October 21, 2008
Here's a quick timeline:

~2000: Anyone who could afford a DVD drive for their computer could rip movies, with some minor hassle, using free tools floating around the internet.

~2003: The free tools become easily usable by anyone with a brain stem and an interest in doing so. The programs crack DVD encryption with frankly embarrassing ease and speed.

2007: RealNetworks tries to make a legitimate tool for ripping DVDs while leaving in some DRM (i.e., restrictions on use).

2008: The Motion Picture Association of America (MPAA) sues them for aiding infringement. The EFF files a brief supporting RealNetworks' position, basically claiming it is fair use to copy your own DVDs and that people have been doing it for almost a decade.

Today: The MPAA mocks the EFF for "living in the past".

So, with dozens of commonly used programs out there for backing up DVDs - a perfectly reasonable thing to do, since the lifespan of data on commercial hard drives is basically infinite if you back it up properly - the MPAA decides to target the one that actually keeps the encryption they originally put on the disks. Their definition of the past is pretty funny too, since (eight years ago/today/in the foreseeable future) DVD backup (was/is/will continue to be) legal under the terms of fair use and really, really easy. I'm kinda surprised RealNetworks even found a market for it with the number of effective free tools out there.
Saturday, October 4, 2008
Cloud computing has been the hot topic in many GIS circles for the last year or so, largely for the same reason it has been building steam more recently in most IT circles in general - datacenters and bandwidth speeds are nearing the point where the promise of mass cloud computing is feasible for corporate users (which is where the money is). For anyone not previously versed in this topic, cloud computing is basically the movement of applications, services, and data from local storage to massive datacenters run by the likes of Google, Yahoo, Amazon, and Microsoft. You probably use it already - say, if you use GMail rather than a local application like Outlook or Windows Mail.

Maps have obviously moved there too. If you own a computer you probably use some kind of online map for directions. It doesn't necessarily need to be in a browser either - Google Earth and NASA's WorldWind might be local applications, but all of the data and services are running off of datacenters somewhere else.

This somewhat slow progression is expected to accelerate as activities typically performed by local IT departments for small and medium-sized businesses are increasingly replaced by cloud apps offered basically for free by the above organizations. It might seem foolish to marry yourself to a particular platform or business in this fashion, but (1) a lot of these things are built on open standards like LAMP anyway and can be transferred around, and (2) companies marry themselves to a vendor all the time (see SAP). Traditional desktop software vendors are shifting to offer their stuff at least partially as a service (and thus online). Windows 7 isn't going to have a mail program; it is going to fill that functionality with Live Mail. ESRI, the biggest GIS software vendor, has made a point of making it extremely easy to pull online data services into map documents just as you would add layers on a local computer.
A great number of GIS data providers, largely governmental, are not by and large going to venture into the cloud all that soon, not without intervention by legislators. Why? Liability, tradition, and data sensitivity.

What public data exists is often of variable quality, especially when overlaid with information from other sources. Throw some bad data out there and, even if you include metadata with a hefty disclaimer, morons will still use it to hike into a blizzard and sue you for having to eat their children. Even with good data, there is the problem of interpretation. Take a parcel layer from any given city/county government in the United States and throw it on Google Earth. GE does a pretty good job, but I would wager good money the satellite imagery isn't accurate to a quarter of an inch. The parcel layer is, by law. The number of people who would go on Google Earth and stir up property disputes without this knowledge is probably substantial enough to be a factor in deciding whether to release it.

And of course there is the security issue. Knowing which substation can black out a particular city block, which water main feeds that block, where the communication lines run, the GPS locations of emergency vehicles - this stuff could be used not just by some existential terrorism threat but by normal criminals to cause all sorts of mischief and evade capture.