Tuesday, May 29, 2012

Twitter Improves Site Speed by Dumping Hash-Bangs

Twitter stamp image created for Tutorial9 by Dawghouse Design Studio.

Back in September 2010 Twitter changed how its site renders by pushing much of the processing to the web browser using JavaScript and hash-bang (#!) URLs. Today Twitter announced it is essentially dumping that approach:

To improve the twitter.com experience for everyone, we've been working to take back control of our front-end performance by moving the rendering to the server. This has allowed us to drop our initial page load times to 1/5th of what they were previously and reduce differences in performance across browsers.

Surprising no one, given that I am exactly the kind of guy who would say this: I told you so, Twitter.

The rest of the Twitter post explains why #! URLs are not the best solution for rendering content quickly and consistently. Sadly, not every type of Twitter page will see this update as I noted last week:

Congratulations to Twitter for making parts of its site five times faster than its broken implementation.
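For anyone unfamiliar with how hash-bang URLs work: the fragment (everything after the #) is never sent to the server, so the server returns a mostly empty shell and JavaScript must parse the fragment and then fetch the real content, adding a round trip to every initial page load. A minimal sketch of the client-side half (the function name and URLs are mine, not Twitter's):

```javascript
// Extract the client-side route from a hash-bang URL. The fragment
// (everything after #) never reaches the server, so a script like this
// has to run first, parse the route, and only then fetch the actual
// content -- an extra round trip on every initial page load.
function hashBangRoute(url) {
  var match = url.match(/#!(.*)$/);
  return match ? match[1] : null;
}

hashBangRoute('https://twitter.com/#!/username'); // '/username'
hashBangRoute('https://twitter.com/username');    // null (no client-side route)
```

Server-side rendering removes that dance entirely: the server sees the full path and can return finished HTML in the first response.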

Thursday, May 24, 2012

Three Browsers in One: Lunascape

Screen shot of Lunascape's configuration screen where users can change rendering engine.

I like to think I'm pretty smart about web browsers, sporting four on my mobile phone and six on my desktop computer in regular day-to-day use. Heck, I even started the evolt.org browser archive back in 1999 with 80 unique browsers at the time (which I am pimping to the W3C Web History Community Group).

But then this tweet came through my timeline:

How did I miss this?

So I went to the Lunascape Wikipedia article, and then to the Lunascape English site and promptly downloaded it.

It's too early for me to review it, but I did run a quick check to see how it reports itself in the user agent string as compared to the browsers whose core rendering engines it is using. My results...

Trident

Internet Explorer version 9 reports itself as:

Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)

Lunascape, when using the Trident rendering engine, reports itself as:

Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0; SLCC2; .NET CLR 2.0.50727; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; .NET4.0C; .NET4.0E; BRI/2; InfoPath.3; Lunascape 6.7.1.25446)

Gecko

Firefox 12 reports itself as:

Mozilla/5.0 (Windows NT 6.1; WOW64; rv:12.0) Gecko/20100101 Firefox/12.0

When using the Gecko rendering engine, Lunascape reports itself as:

Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.2.28) Gecko/20120410 Firefox/3.6.28 Lunascape/6.7.1.25446

WebKit

Chrome 19 reports itself as:

Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/536.5 (KHTML, like Gecko) Chrome/19.0.1084.46 Safari/536.5

Safari 5.1.2 for Windows reports itself as:

Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/534.52.7 (KHTML, like Gecko) Version/5.1.2 Safari/534.52.7

Under the WebKit engine, Lunascape reports itself as:

Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.3 (KHTML, like Gecko) Lunascape/6.7.1.25446 Safari/535.3

Conclusions

It looks like Lunascape doesn't bundle the same versions of the Gecko and WebKit rendering engines as the browsers with which we've come to associate them. Its Gecko build corresponds to Firefox 3.6.28 (rv:1.9.2.28) rather than Firefox 12, and its WebKit (535.3) sits between Safari 5.1.2 (534.52.7) and Chrome 19 (536.5).

Beyond that, I have no conclusions. I was just looking at the UA string and figured others might find it interesting and/or useful.
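If you do want to pick Lunascape out of your server logs, here is a rough sketch based only on the UA strings quoted above. UA sniffing is fragile and the function name is my own; treat it as illustration, not production code.

```javascript
// Classify which engine a Lunascape UA string reports, based only on
// the strings quoted above. Order matters: check Trident first, then
// look for the "Gecko/<builddate>" token that Gecko builds carry.
function lunascapeEngine(ua) {
  if (!/Lunascape/.test(ua)) return null;   // not Lunascape at all
  if (/Trident/.test(ua)) return 'Trident';
  if (/Gecko\//.test(ua)) return 'Gecko';   // "Gecko/20120410" style token
  if (/AppleWebKit/.test(ua)) return 'WebKit';
  return 'unknown';
}

// The WebKit build says "(KHTML, like Gecko)" but has no "Gecko/" token,
// so the ordering above still classifies it correctly:
lunascapeEngine('Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/535.3 (KHTML, like Gecko) Lunascape/6.7.1.25446 Safari/535.3'); // 'WebKit'
```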

Wednesday, May 23, 2012

Failure of Responsive Design is Why Facebook's IPO Tanked

The VentureBeat article as viewed through a mobile browser.

Stuck for ideas for an article? Did you hear that Facebook's IPO isn't netting them enough billions of dollars and so is referred to as a failure? Have you heard about the hot new technique for making generic sites mobile-friendly? Need to get people to click through to your article regardless of its content, requiring some sort of link-bait headline?

This seems to be the recipe for disaster that VentureBeat mixed up for its article What's next for mobile now that adaptive design has failed?, written by the CTO of CBS Interactive. What's so awful is that anyone who uses Facebook both on a mobile device and on a computer can debunk the article almost immediately. Let's recap some points from the article:

Like many other engineering-led cultures, Facebook has embraced adaptive design, also known as responsive design, where essentially the same code can render itself down from a desktop browser to a tablet to a diminutive mobile screen.

This isn't even close to true. Facebook does not use the same front-end code (HTML, CSS, JavaScript, images) to power my experience on my phone as it does the experience on my desktop. I know this because I have to jump from one platform to another to perform some functions only available in one experience. I don't even have to view-source to know this.

[W]hen a full size web page is adapted down to a mobile form factor, it forces a lot of vertical scrolling — even if some components are removed and others are made smaller.

I thought we had debunked the myth that users resist scrolling. We've known for years that desktop users scroll, and mobile users scroll just fine. If anything, mobile users have had to scroll far more than desktop users for as long as they've been surfing.

[P]ublishers should embrace swiping. Users are not perturbed at all to see a full page interstitial ad stuck into the mix while paging through content, making the tablet extremely monetizable.

Here the argument against responsive design is clear — scrolling doesn't give sites the opportunity to stuff as many advertisements into content. Sites cannot monetize as easily with responsive web design as they can with interstitial ads in a swipe model. Except this has nothing to do with the technical merits of responsive design, just with a potential revenue source.

After a flood of comments, the author responds with this:

The point of this article is on the limited monetization potential of selling the same experience across different devices.

I propose then that the title shouldn't be What's next for mobile now that adaptive design has failed?, but instead could be Limited Monetization Potential of Same Experience on Different Devices. It's not likely to garner any clicks as a headline, but at least the author wouldn't have to endure a public flogging.

Blaming a development technique for a business case failure is silly. Responsive design is just a way to achieve an objective, not a business case itself. Blaming a technique or technology for a business case failure is the same as trying to apply the same hammer to multiple nails (to quote the article).

Using Facebook as a reference point in the article is technically flawed and merely cashes in on its recent headline domination. It's the kind of article I am obligated to address here so I can put a stake in the ground for when clients and partners see this kind of headline percolate up into their view; otherwise I would be forced to disassemble it time and again.

Speaking of flogging (I was, a couple paragraphs ago), when I drafted this up last night I had not seen that so many others had written up their own responses. I include links to a couple others for your enjoyment:

Wednesday, May 16, 2012

Responsive Image Chaos

Panel from the comic by CSS Squirrel.

TL;DR: This is just a recap of what's happening now. If you are up to speed as of today, you can just skip to my brief opinion.

Background

As I mentioned in my post iPad Retina Display Concerns and Tips, even Apple, with over a year of the Retina Display on the market, had not devised a method to detect pixel density and serve appropriate images. Visiting Apple's site on an iPad 3 first gets you all the regular resolution images, and then (after some UA sniffing) pulls down the high resolution images. This is generally regarded as a bad idea, especially considering people who have bandwidth caps and fees.
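A client can make that decision itself before requesting anything. The sketch below is a hypothetical helper (not Apple's approach, and the file-naming convention is mine); in a browser the `dpr` argument would come from `window.devicePixelRatio`:

```javascript
// Hypothetical helper: choose the 1x or 2x asset up front so a low-DPI
// device never downloads the heavier image. In a browser you would pass
// window.devicePixelRatio; a plain value is used here for clarity.
function imageForDensity(basename, ext, dpr) {
  return dpr >= 2 ? basename + '@2x.' + ext : basename + '.' + ext;
}

imageForDensity('hero', 'jpg', 1); // 'hero.jpg'
imageForDensity('hero', 'jpg', 2); // 'hero@2x.jpg'
```

Picking once, before the request, is the whole point: nobody pays for two downloads of the same picture.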

The W3C spun up the Responsive Images Community Group in February (I am a member, but fell off after the JPEG 2000 discussion) hoping to get feedback from developers on updates to HTML that would allow for multiple assets to be specified. Ideally, a browser will see all the assets and choose the best fit, preventing redundant downloads. One of the goals was to define a change to HTML that would still keep it backward compatible for older browsers, unsupporting browsers, and browser add-ons (such as accessibility tools).

This was happening in parallel with WHATWG, the other standards body working on HTML. WHATWG, however, seemed to be unaware of the Responsive Images Community Group and the discussions going on over there. Even though WHATWG now has its own W3C Community Group, its purpose is patent coverage under the W3C, and WHATWG members are not actively participating, let alone wandering through to see what other W3C Community Groups exist (WHATWG as W3C Community Group in Name Only).

Happening Now

WHATWG has released its own proposal (added to the HTML draft specification) for addressing adaptive images and it appears to have taken nothing from the work being done in the W3C Responsive Images Community Group. Given the bitterness among web developers toward the WHATWG process it is no surprise that there has been an eruption of anger, frustration, and revisiting old issues. I think this tweet sums it up nicely:

People far closer to the issues than I (and far smarter, oddly) have weighed in on this. I don't care to recap it all here because it requires far too much detail and back-story. Instead I have curated some posts you can (and maybe should) read to give you the technical breakdown and the process breakdown (see how I am using "breakdown" in two different ways there? not yet? you will):

Feedback from the Community

WTFWG by Tim Kadlec:

The developer community did everything asked of them. They followed procedure, they thoroughly discussed the options available. They were careful enough to consider what to do for browsers that wouldn't support the element—a working polyfill is readily available. Their solution even emulates the existing standardized audio and video elements. […] Meanwhile an Apple representative writes one email about a new attribute that only partially solves the problem and then 5 days later it's in the spec.

Secret src by Jeremy Keith:

But let's be clear: this is exactly how the WHATWG is supposed to work. Use-cases are evaluated and whatever Hixie thinks is the best solution gets put in the spec, regardless of how popular or unpopular it is.

Responsive Images and Web Standards at the Turning Point by Mat Marquis:

Participants in the WHATWG have stated on the public mailing list and via the #WHATWG IRC channel that browser representatives prefer the img set pattern[;] […] they understand the browser side better than anyone.

On the other hand, the web developer community has strongly advocated for the picture markup pattern. Many developers familiar with this subject have stated, in no uncertain terms, that the img set syntax is at best unfamiliar and at worst completely indecipherable.

From a post at the W3C Responsive Images Community Group by Kevin Suttle:

The more I think about this, the more I think that this isn't our problem to solve. It's really a browser and server problem.

Media Queries Can't Be Used for Resolution Negotiation by Tab Atkins:

You simply can't make a good decision about which resolution to send to the user based on information from Media Queries. It's fundamentally a hard decision based on information that's difficult to expose in a reasonable manner.

HTML5 adaptive images: end of round one by Bruce Lawson:

[A]s it exists in the current, first draft of the spec, the syntax is revolting:

<img src="face-600-200-at-1.jpeg" alt=""
     srcset="face-600-200-at-1.jpeg 600w 200h 1x,
             face-600-200-at-2.jpeg 600w 200h 2x,
             face-icon.png 200w 200h">

Of course, this can be improved, and must be. This isn’t just about aesthetics. If the syntax is too weird, it will be used wrongly.
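To illustrate why the descriptor soup invites silent errors, here is my own rough reading of that first-draft syntax as a parser sketch (not a conforming implementation). Note how an unrecognized descriptor simply vanishes rather than being flagged:

```javascript
// Rough sketch of parsing the first-draft srcset syntax quoted above.
// Each comma-separated candidate is a URL followed by w/h/x descriptors.
// A typo such as "600v" instead of "600w" is silently ignored, so the
// developer gets no signal that anything is wrong.
function parseSrcset(value) {
  return value.split(',').map(function (part) {
    var tokens = part.trim().split(/\s+/);
    var candidate = { url: tokens[0], w: null, h: null, x: 1 };
    tokens.slice(1).forEach(function (d) {
      var n = parseFloat(d);
      if (/w$/.test(d)) candidate.w = n;
      else if (/h$/.test(d)) candidate.h = n;
      else if (/x$/.test(d)) candidate.x = n;
      // anything else falls through unnoticed
    });
    return candidate;
  });
}

parseSrcset('face-icon.png 200w 200h'); // [{ url: 'face-icon.png', w: 200, h: 200, x: 1 }]
```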

I found another tweet (who doesn't like 140 character summations?) that I think captures the gist of the concern with the WHATWG proposal:

My Take on This

I don't care for the WHATWG proposal. I think it's a cumbersome mess with too many places for errors (typos) that a developer won't see without a lot of extra work.

I am happy, however, that there is finally movement. Even movement in the wrong direction can be good in the long term as long as the folks who've blown up about it the last few days can have an impact.

I also don't believe this is anywhere near over (also good). In the WHATWG IRC chat from yesterday, Hixie was open to the idea of writing an article for A List Apart. The suggestion for the piece would be to cover how people can get involved, though I suspect it will tackle some of the technical issues people have with the WHATWG proposal.

Even if it all falls apart, web developers have the power to change the spec by just using what they prefer coupled with polyfills. Since the HTML specification is ultimately intended to reflect what is happening in the wild, it may just catch up. After all, that's the tactic we used with the time element.

Update: May 17, 2012

Thursday, May 10, 2012

Exclusion Is a Feature Now

Every day I see examples of web developers allowing their ego to get in the way — trumpeting one browser over another, one technology over another, one methodology over another, and so on. These are typically not based on solid business or technical arguments. This week one stood out to me.

Paydirt

Earlier this week Paydirt proudly proclaimed We don't support Internet Explorer, and we're calling that a feature. The arguments speak more to the convenience of the developers than to the needs of its users. The author cites the difficulty of trying to get sites to render pixel-perfect across various versions of various browsers, which demonstrates that he/she is missing the point of progressive enhancement. The writer also claims to be happier for it because skipping the dirty work of IE makes us very happy. Heck, I'd be happy to give up on some browsers, too, but that's not my job.

This proud proclamation comes early:

That's why we made a key decision at Paydirt: we don't support IE - at all - and we don't pretend to. You can't even sign up for Paydirt using IE.

Paydirt notes that it has received exactly zero requests for IE support, angry or otherwise. Given that the site targets graphic designers, developers and copywriters and shows the Macintosh UI in its software screen shots, it doesn't surprise me that only 1.63% of its traffic is from Internet Explorer. That low number of users is a far more compelling case to stop building support for any specific browser and instead to just rely on standards-based development.

However, had Paydirt done that nobody would have cared and there would be no press from it.

Microsoft

Rey Bango works for Microsoft promoting standards and IE's support for them. He's also a member of the jQuery Core Project Team, so he's no slouch on the code side. He tried to access Paydirt with Internet Explorer by adjusting IE's user agent string to get past a browser sniffer. His blog post Hey Paydirt: Your Site Works Just Fine in IE details some of the fun he had:

The signup was painless and I waded through the app with Chrome & IE side-by-side. As I went through, I didn't notice any differences in the functionality. Buttons worked as expected, data was being saved and even panels with fading functionality worked as expected.

He also found some vendor prefixes in the CSS, at least one of which was an -ms prefix. Not a big deal on its own, given how many developers leave code behind during updates, but it's clear that some of the shiny features were relying on browser-specific code that could have been achieved with standards-based markup.

What this response suggests instead is that Paydirt went out of its way to exclude IE. Instead of saving time by just supporting standards and ignoring IE issues, Paydirt spent time to exclude IE. That's not really a good way to argue that you are saving time by excluding a browser.
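For what it's worth, the kind of sniffer Bango worked around typically amounts to a substring test on the UA string, which is exactly why changing the reported UA defeats it. A hypothetical sketch:

```javascript
// A naive sniffer: block anything whose UA string contains "MSIE".
// Because it only inspects a self-reported string, any user who changes
// the UA (as Bango did) walks right past it.
function blocksBrowser(ua) {
  return /MSIE/.test(ua); // IE 9 reports "MSIE 9.0"
}

blocksBrowser('Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)'); // true
blocksBrowser('Mozilla/5.0 (Windows NT 6.1) AppleWebKit/536.5 Chrome/19.0');      // false
```

A gate this flimsy costs development time to build and maintain while keeping out exactly nobody who wants in, which is the point above.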

Some Guy (Bart van Zon)

I had decided that this little dust-up was over. And then today a link to this blog post came across my Twitter stream: Supporting IE Is Too Much Work.

It's a self-identified rant that nicely and succinctly summed up my thoughts (more succinctly than this post you might still be reading). The comments from developers complaining that it's too hard to test in Internet Explorer on a Mac seem to be what pushed him over the edge:

Seriously, if no slick way is a good reason not to test, you should be flipping burgers instead of developing. Your main job is development, not being a hipster. [emphasis added]

He notes that given the market share of Windows, not taking the necessary steps as a developer to set up a Windows/IE environment is unacceptable.

I completely agree.

Related

I have a long history of ranting at lazy developer techniques:

Tuesday, May 8, 2012

Now the Mobile Web Is Dead?

It was barely two years ago that I scoffed when Wired declared the web dead (Enough about the Death of the Web). Fast forward to today and BetaNews refines the claim to just the mobile part of the web: The mobile web is dead.

I am immediately suspicious of an article that bases its claims on data to which it does not link. In this case, the article references comScore data describing how little time users spend online within a web browser. This data seems to be a comparison of how much time users spend on Google or Apple properties within an app versus within the browser.

The only comScore piece I could find that fits the premise of the BetaNews article is comScore Introduces Mobile Metrix 2.0, Revealing that Social Media Brands Experience Heavy Engagement on Smartphones. To pull a block of factoids out of the comScore release:

On Facebook, the top ranked mobile media property by engagement, 80 percent of time spent was represented by app usage compared to 20 percent via browser. Twitter saw an even higher percentage of time spent with apps at 96.5 percent of all minutes. In contrast, Microsoft Sites was among brands that saw browser access driving a majority of usage at 82.1 percent.

Let's consider that so much of mobile use is driven by social media that it makes sense Facebook and Twitter are go-to examples. Let's also consider that both Facebook's and Twitter's mobile sites mostly suck (Twitter's new mobile site was not yet live when these data were captured), so it makes sense that their users would prefer to engage via an app. It also stands to reason that Microsoft, not a social media juggernaut, would see more traffic to its web sites than via apps.

The BetaNews article comes to this conclusion from the minimal amount of data:

So outside Wikipedia, the mobile web is dead. It's mostly apps now.

Evidence suggests that some media reserved for apps are moving back to the web (Web journey complete, FT switching off iOS app). Some apps aren't even apps in the strictest sense, but just a wrapper for a browser that loads a mobile-web-version of a site (You'll never believe how LinkedIn built its new iPad app (exclusive)). Even online retailers have started to move from apps back to the mobile web for commerce (Eliminating the speed delays with mobile Web shopping).

I could be accused of cherry-picking examples, but the declaration that the mobile web is dead makes the same mistake and doesn't even cite its source properly.

Apps have their place, the mobile web has its place. As long as there are users and business cases for each, they will both persist.

Related

Update, May 10, 2012

ReadWriteWeb makes an opposite argument: The App is Dead (OK Not Really, But The Browser Is Back). The first part of the title is link bait, and I don't agree, but the rest of the title clarifies it a bit.

Monday, May 7, 2012

New Crowdsourced Translation Option

Ackuna.com logo.

Many organizations don't have the budget for a full translation / localization project, and some don't even know where to start. In late 2009 I wrote about low/no-cost options from Google (machine translation) and Facebook (human-powered): Facebook and Google Want to Translate Your Site.

A new option has emerged recently, covered in the Mashable piece Free Online Human Translation Service Takes On Babelfish, Google Translate. Unfortunately the writer of that piece doesn't seem to understand the rigor that has to go into the translation process, so opportunities to provide a deeper analysis are missed in the article.

The service is called Ackuna, a free offering from a translation agency. Mashable's suggestion that this service takes on the two translation giants on which most web users rely is silly — Google and Babelfish provide real-time machine translation. Ackuna does neither. Ackuna uses people to provide translation and does so at the pace of the volunteer translators.

I have already made a case against machine translation for anything other than casual or immediate needs. I almost always counsel my clients against its use, including the free Google translate widget you can drop into a web site. There are exceptions, of course, but that's out of the scope of what I am addressing here.

Because Ackuna uses humans for translation, there are a number of questions that anyone looking to use Ackuna should ask. I detailed a set of questions in my 2009 post, but I'll recap here (excluding the questions regarding Facebook Connect):

  1. Does Ackuna attract users who are fluent in the desired target language?
  2. Are these users willing to help translate your content for free?
  3. Is the translator a subject matter expert?
  4. Is the translator part of your target audience (including geographic and demographic breakdown)?
  5. Are you (or your client) comfortable letting unknown third parties translate your message?
  6. Is time budgeted to identify content for translation?
  7. Is time budgeted to have someone review the translation?

Ackuna's FAQ page answers some of these questions, but doesn't really explain how you qualify a translator. Ackuna's translators are ranked on the site by a combination of user feedback and badges. Think upvotes and downvotes, with points determined by whether a translation (or a step in one) was accepted. Badges are awarded based on other translators marking submitted translations as accurate.
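As a thought experiment only (this is a hypothetical model, not Ackuna's actual formula), that kind of system boils down to something like:

```javascript
// Hypothetical reputation model -- NOT Ackuna's actual formula. Accepted
// work earns points, rejections cost some, and peer votes nudge the total.
// With few raters, most translators end up clustered around similar scores,
// which is the problem discussed below.
function reputation(accepted, rejected, upvotes, downvotes) {
  return accepted * 10 - rejected * 5 + upvotes - downvotes;
}

reputation(4, 1, 7, 2); // 40
```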

When it comes to deciding whether a translation is correct, assuming you don't speak the target language, Ackuna doesn't make any guarantees:

Use a translator's reputation and badges as an indicator of their credibility, and take into account the comments and feedback left on each translation by other users. Use these factors and your best judgment before accepting the translation of your text.

If timing is a concern, remember that translators are providing translations because they want to. The only pay-off for these translators is badges and points. When you have no contract and no way to pressure someone for work, there is no guarantee it will ever be completed. And if you can't wait and decide to walk away with what's been translated so far, the FAQ has bad news:

How do I download my completed translation?

[…] You will not be able to view a completed translation until every segment in your project has at least one translation submitted.

You also cannot keep your translations private, which can be a bit tricky, especially if some of your content is sensitive or personal. Given this clause in the terms & conditions, you may want to think hard about what you post for translation:

[Y]ou give the right to Ackuna and its affiliates to store your input indefinitely and reuse it at any time and for any purpose at our discretion.

Ackuna needs critical mass to produce good translations (or translators whose profiles don't read like hipster spam-bots). It needs many translators reviewing each other's work to produce robust translations in timeframes that matter for businesses. Ackuna needs more users ranking one another's work; otherwise it may be too hard to know if that Simplified Chinese translation really conveys your message properly, especially when the translators all have a similar rating. Ackuna's bare-bones interface may not help it attract good Samaritans who just want to translate, since it's not easy to see all the projects in one pass (you have to page through them) and the search feature doesn't work (yet, it claims).

Ackuna itself is not a bad idea. A translation workflow and process is a necessity in any translation project and Ackuna provides some of that. If you already have translators available to you, it might even make an effective no-cost solution to manage the workflow and get others to weigh in on the work.

What Ackuna could do is counsel its users on what makes a good translation, maybe even cross-selling its parent company's services. From there it should group translations into industries or subject areas so that those with experience in them can find content more relevant to their skills. In addition, finding a way to indicate that a translator has specific industry or regional expertise, and ranking translators on it, could go a long way toward helping a user understand whether a translation is as good as it could be.

I want to be clear that I am not criticizing Ackuna (though I could be criticizing Mashable's presentation of Ackuna). Providing a free service for something so rooted in the complexities of human language goes beyond what its technology can do. As I have commented before about free services, you get what you pay for.