WordPress’ freemium model fails

Today, I noticed for the first time that WordPress would now like to place ads alongside my content. The only way to avoid this is to pay them, and that’s not what I blog for. I’m also not inclined to place my own advertising to cover costs, so I’m left with no choice but to conclude that freemium is dead, and that I should leave this platform in favour of a private blog somewhere. It’s a shame, as the keyword economy is thereby broken. Is there anyone other than Technorati to manage keyword clouds? (Google, and hence Blogspot, are evil, so I won’t go there.)

Privacy and freedom with subscription and cloud

There are certain risks associated with sending your data up to the cloud for safe-keeping. Chief among these, some would say, is privacy. If your data is not encrypted in the cloud with a passkey that only you know, or have access to, then you don’t know who’s reading. This fear was once again awakened recently with the realisation that the US government was running PRISM and accessing ISP data more or less directly. What else could they be plugging into? I believe we will find that the US government had direct access to all of Facebook, as well as other sites. The corresponding denial by the companies involved seemed rather well-coordinated.
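The “passkey that only you know” requirement above can be made concrete: the key should be derived locally from a passphrase, so that only ciphertext ever leaves your machine. As a minimal sketch using only the standard library – the function name and parameters here are illustrative, and in practice you would feed the derived key into a vetted cipher implementation (e.g. AES) rather than roll your own:

```python
import hashlib
import os

def derive_key(passphrase: str, salt: bytes, iterations: int = 200_000) -> bytes:
    """Derive a 32-byte key from a passphrase using PBKDF2-HMAC-SHA256.

    The passphrase never leaves the local machine; only data encrypted
    with the derived key would be sent to the cloud."""
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode("utf-8"), salt, iterations)

# The salt is not secret and can be stored alongside the ciphertext.
salt = os.urandom(16)
key = derive_key("correct horse battery staple", salt)
print(len(key))  # 32
```

The point is architectural rather than cryptographic: as long as key derivation happens client-side, the cloud provider (or whoever is plugged into it) holds only opaque bytes.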

How else are we putting ourselves at risk, and can we ever be sure that we’re not being taken for a ride when using computers? While some argued that cloud-based services would be a road towards subscription enslavement, this apparently has not happened: Google Reader was shut down rather than made a subscription service, and Picasa Web images were pushed to Google Plus. Another example would be Yahoo! Mail, which silently opened a previously paid-for IMAP interface to customers some time before March 2011.

In other recent news, however, Adobe is following in the footsteps of Microsoft and converting its Creative Suite (including the popular Photoshop) to a rental model that requires that computers connect to the internet at regular intervals. Adobe products had been known to “phone home” for some time. For consumers, such home-phoney activities carry the uncertainty of exactly what data is transmitted. In fact, few of us actually check our outgoing connections sufficiently meticulously to capture encrypted privacy-violating rogue packages that some softwares may send. This is even more true with our mobile phones, an area in which 100% open source solutions are still limited due to the relative lack of open-sourced apps.
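Checking who your machine is talking to is at least a start. On Linux, the kernel exposes every open TCP socket in /proc/net/tcp, with endpoints in a hex format; a small sketch of decoding one such entry (the helper name is mine, and this is Linux-specific):

```python
import socket
import struct

def decode_endpoint(hex_endpoint: str) -> str:
    """Decode a /proc/net/tcp-style endpoint like '0100007F:0016'
    into 'ip:port' form. The IPv4 address is stored as a
    little-endian 32-bit hex value, the port as big-endian hex."""
    ip_hex, port_hex = hex_endpoint.split(":")
    ip = socket.inet_ntoa(struct.pack("<I", int(ip_hex, 16)))
    return f"{ip}:{int(port_hex, 16)}"

print(decode_endpoint("0100007F:0016"))  # 127.0.0.1:22
```

Of course, this tells you the destination of each connection, not what is inside the encrypted payload – which is precisely the residual uncertainty complained about above.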

To be rigorous about privacy and economic-freedom-to-perpetually-do-as-you-please-with-software-you-paid-for, locally hosted open source software is the only way to go. And if you want to be really rigorous, you must read and comprehend the entire source code of everything you use. But is that still possible? Haven’t we got to a place where that source code is Too Long, Didn’t Read (TL;DR)? Instead, my impression is that we treat open source developers as saints, presumably in the realisation that if they’re anything else, we may still be screwed. Code review helps, but how do you know the review coverage of the code you use? Did one person review it after it was written, or ten? Who were they, who do they work for, and how smart are they? Maybe some of them work for the NSA. Perhaps the best idea is to stay offline except on those occasions where the online activity is genuinely essential and you know exactly why you need it.

Humans have a tendency to destroy the resources they depend on. Agriculture helped us grow our populations beyond the point naturally sustainable purely by hunting and gathering, and we subsequently hunted and collected many species to extinction, and continue to do so today. We drilled a hole in the ozone layer, poisoned fisheries with oil spills, are wreaking havoc with naval sonar and fracking, are too undisciplined to use risky nuclear technologies safely, stir hatred with drone-mediated killings of civilians in countries far beyond our borders, and may be raising sea levels, threatening the homes of hundreds of millions of people. Why should the Internet be a safe and enduring place?

Apple and Adobe

Much has been written lately about Apple falling out with Adobe. Here’s my version of that story.

Apple is always keen to push its customers forward to the next release of its operating system. In the most recent case, it offered a very cheap mini-upgrade which didn’t even change the name all that much (Leopard to Snow Leopard). This is important for Apple, because it makes life better for developers coding for the platform, and, remember? Developers, developers, developers, de… you get the picture.

Adobe put a dent in Apple’s plans when it caused a kerfuffle about possible incompatibilities of CS3, or even CS4, with Snow Leopard. This may have put some customers off upgrading, and detracted from Apple’s long-term strategy. I could fully understand this if Adobe felt it had been a loyal partner to Apple all along, and was now being left out on the iPhone. Perhaps Adobe never really understood, or had no tolerance for, the fact that Apple’s reasons for leaving out Flash run far deeper. The reason is MySpace, YouTube, and Facebook: music! Apple budged a little by offering H.264-encoded videos in a special YouTube application, but the key here was that Apple could keep tabs on what was and wasn’t added, and could prevent both hit singles and blockbusters from being shared free of (its) charge.

Apple knew that Adobe would be mad that Flash isn’t going to be on the iPad either, so Steve decided to launch a pre-emptive strike by shouting very loudly about how terribly Apple has suffered at the hands of evil Flash (my observation is that no browser crashes as frequently and effectively as Safari, and to make things worse, it has no crash recovery either, but that’s just a by-the-by for Steve’s personal introspection).

Meanwhile, the reason Adobe keeps a product-for-product edge in market share over Apple’s Pro apps (Aperture, Final Cut Studio, Logic Studio) would be that Adobe also serves Windows customers. So in spite of the fact that Adobe’s files are not always compatible cross-platform (well, neither are Microsoft’s), they at least give themselves that appearance. How to describe Adobe’s position? Well, it needs to make sure it remains tolerated on the growing Mac platform. At the same time, at least for the time being, Apple can’t afford to lose Adobe. Adobe stands to lose the most coercive product in its lineup if the web of the future doesn’t do Flash, but if YouTube et al. found their way unencumbered onto Apple’s mobile devices (via jailbreaking, perhaps), Apple would lose a big chunk of its iTunes Store revenue. So these two companies can hurt each other a lot, and at the moment it doesn’t look like Apple is going to play nice (all the while taking the mickey out of Google – not that I have any sympathy for the latter company given its recent anti-privacy antics).

Book review: The Art Book

I recently picked up a copy of The Art Book at a very reasonable price. It’s basically a list of major artists, one per page, with an article about a “typical” piece of work by the artist, usually a well-known one, and the bottom three quarters of the page displaying that work. Each article starts with a description of the work, then moves on to general observations about works by the artist, sometimes with side notes about his life (rather unavoidable in cases like van Gogh). In all, 500 artists are featured, including sculptors (e.g. Rodin) and less categorizable examples (Koons, Merz). Having been shown in an art gallery may have been the criterion for inclusion.
Entirely missing are Bouguereau and Spitzweg, a slight pain given the recent interest in Biedermeier art. Similarly difficult to explain is the absence of Riemenschneider, the unsurpassed genius of woodcarving, while his contemporary painter colleague, Cranach the Elder, has been included (although not his son). The general impression is that while other forms of art are represented, the focus is still firmly on works done in the plane, especially painting (photography is entirely absent, but Gerhard Richter is there). I had previously thought of Audubon as a highly accomplished book illustrator rather than a painter, but he is there – testimony to the focus on painting. Furthermore, the German traditions may have been slightly underemphasised. The Young British Artists are thankfully missing, although their predecessor, Joseph Beuys, is there.
Artists are presented in alphabetical order, with cross-references suggested. These cross-references are unfortunately not always comprehensive: there is no link, for example, between Miró and Klee. The glossary at the end of the book is basic and was of little utility to me, but will probably help complete newbies.
Now, what could be the value of such a comprehensive selection that gives so little attention to each individual artist? For me, it makes an excellent internet companion that supplies all the essential artists I may have missed, along with suggestions of others to explore. After a while, I got tired of the over-abundance of boring modern paintings which, once the experience has been had, need no longer be preserved – we have Kline, Klein, Hodgkin, Heron, Hayter, and that’s just two letters’ worth. I graduated from the book with a strong yearning for beautiful art, accomplished in craftsmanship as well as composition and aesthetics. All in all, it is a very good selection that leaves little to be desired, and a highly useful reference work at an affordable price; given it was published in 1994, it is incredibly up-to-date and comprehensive.

Phaidon Press, ISBN-10: 071484487X, ISBN-13: 978-0714844879.

Visitor tracking – server or JavaScript?

I was annoyed a few years ago when my hosting provider decided to remove access to the server logs. Previously, they’d provided nice pie charts as well as raw access data indicating which parts of a site were more popular. Importantly, because it was server access data, it didn’t matter whether the requested items were html pages, jpegs or other downloadable items.

The removal of this feature seems to have happened at quite a few hosting companies – otherwise, how to explain the demand for Google Analytics et al.? Leave the hard work to the user’s browser, they said, and send the data to a separate, dedicated server. Server speeds have increased over the same period, so why were features taken away from users? And why, indeed, hand data over to Google et al. when you could get it from your own server host?

Of course, we all realise that this client-side tracking is part of why we’re now all crying out for faster multi-tabbed browsing, a demand that may have helped Google elbow its way into the browser-war theatre with its new offering, Chrome.

It’s quite possible, though, that the penny-pinching users themselves were at fault: unwilling to pay extra for access to their server logs, they drove down the price of web hosting by seeking out ever cheaper hosts, in utter disregard of reliability of service and of features only required by people who actually care about who looks at their web pages.

While I don’t have any data available to determine the exact root cause of the current situation, I do want to point out the duplicated effort: web browsers already send their user agent string, identifying the exact browser version and operating system, with every HTTP request, and in the default configuration the server will usually keep access logs of at least the user’s IP address and the file path accessed on the server.
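To see just how much of the “analytics” data is already sitting in a default access log, here is a sketch parsing a single line in the common Combined Log Format (the sample line and pattern are illustrative, not taken from any particular host):

```python
import re

# Combined Log Format: IP, identity, user, timestamp, request line,
# status, size, referer, user agent -- all recorded by default.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) \S+" (?P<status>\d{3}) \S+ '
    r'"(?P<referer>[^"]*)" "(?P<agent>[^"]*)"'
)

line = ('203.0.113.7 - - [10/Oct/2013:13:55:36 +0000] '
        '"GET /blog/index.html HTTP/1.1" 200 2326 '
        '"http://example.com/" "Mozilla/5.0 (X11; Linux x86_64)"')

m = LOG_RE.match(line)
print(m.group("ip"), m.group("path"), m.group("agent"))
```

Visitor IP, requested path, referring page, browser and operating system: all of it is there before a single line of JavaScript runs in anyone’s browser.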

How many of you really feel that you need more data? Do you need to know your user’s screen resolution? Do you actually compute heat maps for your pages? Have you ever used analysis software for raw server logs, and what is wrong with your hosting company providing web access to analysis tools that are solely based on server logs?
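And analysing raw logs does not require heavyweight software either. A toy sketch, over a hypothetical three-line log excerpt, of the most basic question such tools answer – which pages are popular:

```python
from collections import Counter

# Hypothetical excerpt from a raw access log (Common Log Format).
log_lines = [
    '198.51.100.1 - - [01/Jan/2014:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 512',
    '198.51.100.2 - - [01/Jan/2014:10:00:05 +0000] "GET /photos/cat.jpg HTTP/1.1" 200 20480',
    '198.51.100.1 - - [01/Jan/2014:10:00:09 +0000] "GET /index.html HTTP/1.1" 200 512',
]

# The requested path is the second token inside the quoted request field.
paths = Counter(line.split('"')[1].split()[1] for line in log_lines)
for path, hits in paths.most_common():
    print(path, hits)
```

Note that this counts jpegs and other downloads exactly like HTML pages – the very advantage of server-side data mentioned at the start of this post, and something JavaScript-based trackers cannot see at all.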

Looking forward to your comments.

Verified by Visa scandal

Verified by Visa is a mechanism by which the identity of a Visa card holder making an online payment is supposed to be confirmed. To pass this additional verification step, the user has to enter certain digits of a previously set password. If the user cannot remember the password (or never knew it, having fraudulently obtained the card details), he can click “Forgotten password”, after which he is allowed to reset the password using only information that is actually printed on the credit card, plus the card holder’s date of birth (which could equally be gleaned from other documents found in the same stolen wallet, for instance). I’m not impressed – are you? Think about it: in a password of 6 to 12 (IIRC) characters, including at least one letter and one numerical digit, can you recall which is the 9th character without writing the whole password down? I’m not convinced that a coherently thinking individual was behind this idea.
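To make the usability complaint concrete, here is a toy sketch of the partial-password challenge (the function name and example password are made up; the real scheme’s internals are not public):

```python
def challenge(password: str, positions: list) -> list:
    """Return the characters at the requested 1-based positions,
    as a partial-password scheme would ask the user to supply them."""
    return [password[p - 1] for p in positions]

# A 10-character password meeting the letter-plus-digit rule; could you
# name its 3rd, 6th and 9th characters from memory alone?
print(challenge("s3cretWord", [3, 6, 9]))  # ['c', 't', 'r']
```

The scheme assumes people can index into a memorised string, when in practice most will recite the whole password under their breath – or, worse, keep it written down next to the card.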

Apple’s Mac mini timing, or, where is the spec bump?

A lot has been written about the Mac mini’s imminent demise, since at least May 2007, and again recently. I’ve never been particularly willing to believe this speculation, because the Mini is the product most in demand for shared desktop computing facilities in educational establishments. In the university I am most recently familiar with, this accounts for at least 100 machines in the libraries alone. That’s not counting what individual departments may have in their computing labs (another 100 to 300, maybe?), or any orders by individual researchers, particularly where low-spec machines are desirable, e.g. for grad students (scientists sometimes opt for iMacs because they have enough cash, and spare screens tend to be consumed by Windows machines or dual-head set-ups; the main opportunity to target here is students “visiting” a lab for small projects, either undergraduate or Masters research projects, or grad students visiting from other universities, typically from abroad; a completely untapped opportunity is arts students and staff, for most of whom any word-processing machine will do, so why not buy a cheap Mac?). Scale this to the number of universities in the English-speaking world alone, and you can clearly see a market of a size that Apple would want to harvest. In addition, in a “catch ’em young” world, Apple cannot afford to lose those markets – or the revenue it makes from more unusual applications of the Mac mini, such as server farms. The compact size of the Mini remains quite competitive, in spite of its somehow being spared the slimming frenzy that Mr. Jobs put the iMac, PowerBook/MacBook Pro, and lately MacBook lines through. I suspect this myth remains popular because these educational markets are largely invisible to tech writers, who tend to focus on street and internet retail rather than large corporate/educational orders or wholesale.

It goes without saying that the anticipation of a longer recession will spur sales of low-spec machines, a job description the Mac mini in its current incarnation fits superbly. Nonetheless, it may be true that Apple has decided to delay a spec bump until after the holiday season, so as not to steal the show from its re-engineered laptop line. Remember that at 1.31 kg (2.9 pounds), the Mac mini is among the most portable non-laptop computers ever, and will give you much joy as long as you have a screen available in each location where you want to use it (e.g. home and office; I also recommend buying a second power adapter, as these are somewhat bulky, come with attendant unwieldy cables, and take away from the weight advantage; a final word of warning: it’s not entirely designed for being lugged around, so do treat it kindly!). So it would be a shame for it to go, and possibly too great a loss to AAPL for Apple to really consider this step.

A recent macminicolo article has outlined that company’s reasons for believing in a refresh of the Mini, and a response from Apple Insider points to the possibility that the Mini is an efficient way for Apple to divest itself of old component stock from other product lines (in this case, possibly the Core 2 Duo chip, though it’s not the only candidate I can think of, given the move away from Intel integrated components). As far as the rumours of its demise go, I can only agree with the above-cited articles that the Mini is here to stay for some time yet.

Update 2008/12/17: Further evidence that Apple is making the right decision. Interesting tidbit is that the Mac mini has continued to be one of Amazon’s top five selling items, apparently all the way through 2008, in spite of the ageing hardware!

Four features Opera needs

I’ve been thinking about this for a while, and finally decided to make it public. Opera is possibly the fastest browser currently out there – certainly the fastest available in a final release. There are four features Opera needs before I would use it: “Bookmark all tabs”; a dropdown selector showing the titles of all the tabs (as in Firefox – I know about the tab preview, but gliding over the tab bar is cumbersome); a scrapbook like Firefox’s ScrapBook extension; and good ad blocking. As it stands, I’ll probably be swayed to continue with Firefox if they live up to the 3.1 speed promise in the final release. Otherwise it will be a tough call. Hmmm… WebKit? V8? Except for Opera’s exceptional security track record, those other contenders offer nearly the same features.