Thursday, December 17, 2009

New Tool for Determining Browser Viewport Size

Nine years ago I had become fed up with trying to explain to clients, users, friends, co-workers, and strangers that screen resolution, browser chrome, and browser size combine to create some unique viewport sizes. What this meant was that whether a user had a display at 640x480 or at 1,024x768 was irrelevant if both users had their windows set to the same size. Factor in toolbars, scrollbars, add-ons, and other configuration settings and you had a wide range of viewable space for your web site design that was only loosely related to the actual screen resolution of your end user. I distilled all this down in an article, Real-World Browser Size Stats, Part II (part I described the code I used), written back in 2000 that provided statistics on the viewable area in a browser per screen resolution setting.
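
For the curious, the measurement half of that exercise is trivial; the work is in logging and aggregating the results. Here is a minimal sketch of the kind of script involved (the /log-viewport URL and its parameters are invented for illustration, and this is not the code from the original article):

// Capture the viewport and screen dimensions and report them with an image beacon.
function reportViewportSize() {
  var vw = window.innerWidth  || document.documentElement.clientWidth  || document.body.clientWidth;
  var vh = window.innerHeight || document.documentElement.clientHeight || document.body.clientHeight;
  var beacon = new Image();
  // Hypothetical logging endpoint; swap in whatever your server actually records.
  beacon.src = '/log-viewport?vw=' + vw + '&vh=' + vh +
               '&sw=' + screen.width + '&sh=' + screen.height;
}
window.onload = reportViewportSize;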

Since then many others have started reporting on screen resolutions but few have taken it a step further to report on actual viewable area within the browser. Too many of them, like this chart on the W3Schools site, list the screen resolution trends but neglect to mention that once you pass 1,024x768, users are less likely to surf full screen. Human Factors International even cited my article, but got the conclusion wrong (claiming that as users got more screen real estate they opened more pages).

Today Google has gotten a little closer to getting the point. In a blog post introducing this new tool, Google mentions that users were looking at Google Earth, but just not downloading it. After some testing they found that the download button was displaying off screen to the right (outside the viewport) for some users. So they built a tool to help show the breakdown of viewport sizes of users visiting the site as a contour visualization. After seeing some value in this tool for web developers in general, they modified it a bit to make it possible for a user to interact with the page under the contour map. As the mouse moves across a page, a small block of blank space is left around the cursor so that the user can click links or fire hover actions.

Contour map of viewport size statistics.

Head on over to browsersize.googlelabs.com to try out the tool for yourself. When you get there you will see a sample site is already loaded up. The percentiles used in the overlay are pulled from the latest data of visitors to Google.com, so don't think that the numbers necessarily apply to you (my own experience is that for targeted sites they may not). Type the address of your web site (or any site you want to test) into the search box and submit. Give it a little time to load and you are ready to start looking. You can now see what parts of your site are visible (or cut off) at what viewport sizes.

Beware, however, that this tool isn't foolproof. For liquid sites or for sites that float in the middle of the page you may find the numbers don't mesh up as you'd expect. In those cases you can scale your browser window down (width and/or height) to at least get an idea of how it might appear at a specific window size. The numbers on the contour map also don't correspond to known screen resolutions, so you'll need to compare them to other numbers just to be certain you have your data points right. The tool also doesn't account for scrolling, which we all know users do. Instead it shows you how a page might look simply at first load. Because this is a freebie from Google Labs, consider the documentation sparse (which isn't a criticism).

Amazon.com as viewed using the Google Browser Size tool.

Tuesday, December 15, 2009

More News in the URL Shortener Market

Back in October I commented how the list of URL shorteners has gotten even shorter (or shortener, as I liked to call it). As bit.ly rose to the top thanks to Twitter, Tr.im and Cli.gs called it quits. Things have changed a bit since then.

Recap and Updates

Tr.im

Tr.im took back its statement of impending doom after its blog was overrun with support, was approached with an offer from bit.ly (which it declined), announced three months ago that it was going open source, and has been silent since.

Cli.gs

On December 1, Cli.gs was acquired by Mister Wong, and continues to provide its URL redirecting/forwarding service. What else it will ultimately provide is anyone's guess.

What's New

The market for shorteners is not dead, however. The argument can be made that bit.ly survived simply because Twitter standardized on its platform for tweets, known for their 140 character limit. Some of the big boys now have an interest in this game, and have considerably more resources to bring to bear to bolster what might otherwise be a losing proposition.

Goo.gl

Yesterday Google announced its own URL shortener, Goo.gl. In this case it is not a stand-alone service; it can only be used from the Google Toolbar (for your web browser) or from FeedBurner (Google's feed management service). They do not exclude the possibility of opening it up to wider use in the future. Google claims the service will provide stability (big fat data centers), speed (big fast data centers), and security (URLs will be sniffed to look for spam/malicious sites).

fb.me

If you think Google was ahead of the curve, you're wrong. They were responding to Facebook. Facebook announced yesterday that it was testing its own shortener, fb.me. While Facebook cites the change as a way to stay more open and connected, they are interested in the analytics data they can glean if they own it. Facebook is also smart enough to make sure existing links, such as http://www.facebook.com/aroselli, will work at the new address, http://fb.me/aroselli (it's worth noting that you need to be my FB friend for these links to work — sorry). Right now Facebook is using this feature to automatically shorten URLs shown in the mobile interface.

bit.ly

But the day didn't end there. Bit.ly decided they, too, had an announcement to make for their new bit.ly Pro service. Because bit.ly has traction already with its analytics service, because it's integrated with Twitter, and because it's a stand-alone service, bit.ly doesn't need to worry too much about losing its position just yet. This move, however, entrenches it laterally. They have partnered with AOL, Associated Content, Bing, Clicker, The Daily Telegraph, foursquare, GDGT, Hot Potato, The Huffington Post, IGN, kickstarter, Meebo, MSN, /Message (Stowe Boyd), MTV Networks, The New York Times, OMGPOP, oneforty.com, The Onion, slideshare, someecards, Stocktwits, TechCrunch, The Wall Street Journal Digital Network and blogger Baratunde Thurston (baratunde.com). The example they give is nyti.ms. The example I have already seen out there is Foursquare, which you may have seen in updates from friends. My recent check-in came through in my tweet as 4sq.com/8XoZwz. Just in case you don't believe me, bit.ly provides the analytics on that link.

Uh.oh

While all this develops, I still echo my concern about URL obfuscation and link rot. It's too easy to hide the true destination of a link when you mask it using a shortener. Google may think it can defeat that, but I am skeptical of that claim over the long term. Links may also go away, but the shortener doesn't know it and may not pick up the redirection instructions before they are shut off. Over the next couple of years we'll see just how much link rot abounds on the web as we find ourselves constantly following shortened URLs to broken pages or porn sites.

I will repeat myself from my first post on this topic: If the Mayans had it right, this could be the End of Days we're all expecting in 2012. Prepare yourselves for the Great Linkpocalypse.

Update: November 18, 2013

"y.ahoo.it URL Shortener End-of-life Announcement"

Monday, December 14, 2009

Telling Clients They Are Wrong

If you have spent time as a solo web jockey or your job has you interacting directly with clients, you've probably been faced with the client who asks for something you feel is wrong. If you're new to this, it may seem like a dangerous situation to be in, when in reality it's a great opportunity to establish yourself as an expert and demonstrate your considerable knowledge with well-formed arguments and supporting data/examples. Sometimes the client just isn't quite getting it and things get a bit adversarial.

I have spent a good deal of time coaching friends, employees, partners, and so on (designers, developers, architects, etc.) on the best ways to deal with clients who have trundled down the wrong path and need some correction. Usually these people feel a great deal of trepidation in realigning the client for fear of losing the business or, worse, being overridden and forced to create something with which they don't agree (but which will bear their names).

Conveniently, I don't have to reiterate all the options and steps you can take. Somebody has done a pretty good job of outlining them for me. You can read the full article, How To Explain To Clients That They Are Wrong, to get all the details. You can also read a little behind-the-scenes at the author's blog. It may be worth keeping this client perspective in mind, as mentioned in the article:

...[M]any clients still regard creative digital agencies and freelancers as either kids living in their parents' basement or shady professionals out to take them for every last penny.

For those of you who just want the distilled version, here you go, with some of my own tips peppered within...

First, determine if the client is even wrong or if this is just a knee-jerk reaction on your part. As freelancers, it's easy to let your ego get the best of you. Remember, the client knows his/her business; you know the technology or design rules.

Next, speak the client's language. Ask the client what the business benefit is of the request. Don't try to snow the client with techno-babble or designer-speak. If you can get the client to verbalize the business goal, you are off to a good start. If the client cannot verbalize it, perhaps the client will realize that the request is couched in vanity instead of a tangible reason.

As part of all this make sure you come off as the expert. Dress well. Speak well. Spell well. Brand yourself well. Grammarify well. Make up new words well. Be confident, support your opinions with examples and facts, and be prepared to offer alternatives (hybrid solutions even). And don't be late (to meetings, on deadlines, to bed).

Don't hide from the client or the issue; address it quickly, in person (or perhaps on the phone, but not via email) and with supporting documentation (sign-off letters, email verification). I am a fan of the direct approach. Like a Band-Aid, just tear it off; it will be over more quickly. The article I am referencing is a little less aggressive about being direct, but if you are honest and humble (add some humor) then you should be fine.

If the client is insistent, you may need to back down. The client is the one paying, after all, and if you can document that you have attempted to prevent the client from shooting his/herself in the foot, then you will be fine. Consider making the client sign off that he/she is going against your recommendation. If that's too aggressive, just send an email verification. If you are familiar with A/B testing, now is a great time to propose it. If you aren't, you should go buy a book. In the end, spend some time looking at the results of the change to see if it ended up being more effective.

If you've gotten this far, then you should also go read the comments at the original article. There are some good ones in there, sprinkled among the self-aggrandizing ones.

Sunday, December 13, 2009

How Many Disabled Users?

There is an article over at Practical Ecommerce titled Accessibility: How Many Disabled Web Users Are There? by Joe Dolson. It is refreshing to see more traditional sites dealing with accessibility, especially when it can so significantly affect their bottom line. As an indication that the author gets it:

I often hear business owners claim that their sites aren't used by people with disabilities, so they don't need to pay attention to web accessibility. But there's no basis for such claims because the merchant can't possibly know this information. The tracked profile of a user with a disability, via a typical analytics package, is identical to anybody else using that browser.

The author is smart enough to discard data that doesn't necessarily impact web usage — leg amputations, for example. While presenting the information below, the author reminds us that the Baby Boomers are getting older and crossing the 65-year-old age barrier:

The most commonly discussed disabilities affecting website accessibility are sight and hearing impairments. These specific impairments encompass 6.8 percent of the population age 15 years and older — and climb to encompass 21.3 percent of the population when you look specifically at the population over 65, according to the 2005 report. Eight-point-two percent of this same population is listed as having difficulty grasping objects — which affects the use of a mouse.

A conservative estimate says 1 out of 10 of your users could have an impairment of some sort. For every day that passes, that's another user who has aged or deteriorated in some way, making that number climb over time. That's 15.5 million potential customers. Perhaps this is a better way to present the information to your ecommerce customers.

Follow-up Information

Just three days later the author posted a follow-up article on his own blog titled United States disability statistics: Measurement and sources. The author explains that while his official article still gets to the point about supporting disabled users, this post is intended to provide more detail on the numbers. The author's closing comments explain how the data is good enough for some general numbers, but lacks a little detail on specific issues:

In general, my assumption is that the data may include some individuals who struggle with reading due to dyslexia, dependent on the exact phrasing of the questions, but not all, and presumably includes no or very few individuals with color blindness.

The author (still Joe) is kind enough to provide links to the PDF versions of the U.S. Census Bureau reports he used as his data sources. Since I am also a fan of providing links to the raw data, here you go:

Friday, December 11, 2009

Tables as Consumed by JAWS

There is an interesting article over at the WebAIM blog titled JAWS Ate My Tables.

The article describes how JAWS (version 10 in this case), a screen reader, decides whether an HTML table is used for layout purposes or as a data table. It turns out that JAWS does not lean on the th element or other markup clues. JAWS will look for the attribute DataTable=true|1 as an indicator that the table is used to hold data, but this is invalid HTML and certainly isn't supported by me or by the WYSIWYG editors commonly used in CMSes.

Instead, JAWS looks for the presence of at least 2 rows and 2 columns AND at least 4 cells whose rendered size falls between 200 and 16,000 square pixels to determine if a table is a data table. This means JAWS ignores other markup used for accessibility or indication of the table's purpose. JAWS also analyzes the rendered size of the table, so a data table that has been enlarged by a low-vision user could be treated as layout only.
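
Put in rough code terms, that heuristic boils down to something like the following (my own paraphrase of WebAIM's description, not Freedom Scientific's actual logic):

// Returns true if a table would likely be announced as a data table under
// the heuristic described above: 2+ rows, 2+ columns, and at least 4 cells
// whose rendered area falls between 200 and 16,000 square pixels.
function looksLikeDataTable(table) {
  if (table.rows.length < 2) return false;
  if (table.rows[0].cells.length < 2) return false;

  var qualifyingCells = 0;
  var cells = table.getElementsByTagName('td');
  for (var i = 0; i < cells.length; i++) {
    var area = cells[i].offsetWidth * cells[i].offsetHeight;
    if (area >= 200 && area <= 16000) qualifyingCells++;
  }
  return qualifyingCells >= 4;
}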

ARIA does allow authors to specify a table for presentation only (role="presentation"), but this doesn't work to force a table to be treated as data only. I suggest you check out the post and some of the user comments, and then accept that JAWS just isn't handling tables properly.

Thursday, December 10, 2009

Video Accessible to Keyboard Users

Trenton Moss over at Webcredible has posted an article, Accessible online video for keyboard-only users. The concepts within are very simple, but require developers to take an extra step or two, which may account for why we see so few sites with these features implemented.

One key issue is that developers must not only code client-side script for mouse interaction, but also for keyboard interaction. Since IE behaves differently than other browsers, it must also be written twice (or at least forked). Another factor is the reliance on Flash players, which sometimes won't let you tab into the control from the rest of the page (again, depending on the browser). Regardless of whether the control is in Flash or just HTML with script, the developer must provide a focus state for each control (so users can tell which is selected) and use a logical tab order.

The author gives the example of making the video slider accessible to keyboard users, perhaps allowing users to directly type in a time stamp so the video will jump to that point. This also benefits users who can rely on a mouse by providing a quick way to jump to a known point in the video. Volume sliders would benefit from this sort of update as well.
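
As a rough illustration of that extra step or two, here is a sketch of a custom (non-Flash) play control wired up for both mouse and keyboard, with a visible focus state; the element id and function names are my own, not taken from the Webcredible article:

// Hypothetical play control with id="play-button"; togglePlayback() stands in
// for whatever actually starts and stops the video.
function togglePlayback() { /* start or stop the video here */ }

var playButton = document.getElementById('play-button');
playButton.setAttribute('tabindex', '0');     // reachable via the Tab key

playButton.onclick = function () {            // mouse users
  togglePlayback();
};

playButton.onkeydown = function (e) {         // keyboard users
  e = e || window.event;                      // older IE exposes the event globally
  var key = e.keyCode || e.which;
  if (key === 13 || key === 32) {             // Enter or Space
    togglePlayback();
    return false;                             // keep Space from scrolling the page
  }
};

// And in the CSS, make the focused state obvious:
//   #play-button:focus, #play-button:hover { outline: 2px solid #fc0; }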

Wednesday, December 9, 2009

Bulletproof @font-face Syntax (reprint)

Paul Irish has gone ahead and created a block of CSS that we can reliably embed into our pages that will import .eot and .ttf/.otf font files. In his article Bulletproof @font-face syntax, he breaks down the various options and their support, providing arguments for and against each. In the end, he provides what he considers the best method to declare your @font-face styles in your CSS:

@font-face {
  font-family: 'Graublau Web';
  src: url('GraublauWeb.eot');
  src: local('Graublau Web Regular'), local('Graublau Web'), 
         url('GraublauWeb.otf') format('opentype');
}

If you can support (generate) WOFF or SVG typefaces, then he provides a slightly expanded block of code that can support Chrome 4 and Firefox 3.6 (neither of which has been released yet):

@font-face {
  font-family: 'Graublau Web';
  src: url('GraublauWeb.eot');
  src: local('Graublau Web Regular'), local('Graublau Web'),
    url("GraublauWeb.woff") format("woff"),
    url("GraublauWeb.otf") format("opentype"),
    url("GraublauWeb.svg#grablau") format("svg");
}

Tuesday, December 8, 2009

10 (Obvious) Usability Crimes

Having stumbled across the article "10 Usability Crimes You Really Shouldn't Commit," I can see that the suggestions are pretty obvious, and the number 10 is probably more arbitrary than based on some natural break in severity. However, there are some things in the article I have been repeating for years that people just don't get. And this article uses He-Man graphics, which makes it even cooler.

I'll list the 10 items, but they are much cooler in context and with the He-Man photos, so go there anyway. I've also tacked a quick markup sketch for a couple of them after the list.

He-Man image showing the 'alt' attribute.

  1. Form labels that aren't associated to form input fields;
  2. A logo that doesn't link to the homepage;
  3. Not specifying a visited link state;
  4. Not indicating an active form field;
  5. An image without an alt description;
  6. A background image without a background color;
  7. Using long boring passages of content;
  8. Underlining stuff that isn't a link;
  9. Telling people to click here;
  10. Using justified text.
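
For the first and fifth crimes the fix is a single attribute each; a generic hand-rolled sketch, not taken from the article:

<!-- Crime 1: associate the label with its field via matching for/id values -->
<label for="email">Email address</label>
<input type="text" id="email" name="email" />

<!-- Crime 5: give the image a meaningful alt description -->
<img src="he-man.jpg" alt="He-Man holding his sword aloft" />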

Friday, December 4, 2009

24 Ways Is Back Over 24 Days

If you were paying attention during any of the last few years, you may have noticed that the 24 Ways web site is set up to run as an annual advent calendar for web geeks. Each day the site posts a new article dealing with some aspect of the web, ideally giving something useful to everyone who might be reading it.

You can find daily updates right on the home page, or by following them on Twitter (@24ways) or following the RSS feed in your favorite aggregator (full content of each article is in the feed).

Authors you should recognize will be posting throughout the month (Jeff Zeldman, Eric Meyer, John Allsopp, Jeremy Keith, Christian Heilmann). You can also look back at articles from previous years (2005, 2006, 2007, 2008) to see how many of those old tips still hold true and how many might have changed over time and use.

Because I lost a few days at the beginning of this month, I'll use this post instead to highlight the first four days of articles. It changes daily, so be sure to go check it out each day for the rest of the month.

Working With RGBA Colour
Drew McLellan kicks off the 2009 season with a look at some of the tools CSS3 provides for applying levels of transparency to color values, enabling you to avoid weighing down a site design with heavy PNG images.
Breaking Out The Edges of the Browser
Remy Sharp takes us by the hand and guides us through our first steps into the web applications side of HTML5 with a look at web storage and offline applications. You'll need a nice modern browser and some Kendal Mint Cake.
Have a Field Day with HTML5 Forms
Inayaili de León introduces some of the new form field types available in HTML5, and then goes on to look at some more advanced CSS3 techniques which can be used to keep your forms looking sharp and ship shape.
What makes a website successful? It might not be what you expect!
Paul Boag challenges us to think about what makes sites successful, which has interesting implications on how resources are spent.

Wednesday, November 25, 2009

Enjoying Thanksgiving with Social Media

A lot can be said about the value of social media, with arguments for real business value or ways to stay connected with friends and family or even that most of it is just egocentric drivel. As one of the purveyors of egocentric drivel in my Twitter stream, I can understand that it's not for everybody. I did find a way to garner at least some faint interest from naysayers, however.

Last year I hosted Thanksgiving dinner at my pocket-sized house and practically counter-free kitchen for a dozen people. Since I had been in the kitchen almost non-stop since the prior afternoon I wasn't in much of a position to entertain my guests, but I did have something they found fascinating. I had just discovered the Brightkite Wall.

For those who don't know, Brightkite is a microblogging service, like Twitter, that also has built-in support for photos and geolocation, allowing you to "check in" to a location and post messages and images about the place (or event, etc.). It predates Foursquare, but does not use the game model at all and allows you to check in from anywhere in the world, not a restricted list of pre-defined cities.


Watching the Brightkite wall, waiting for guests.

The Brightkite Wall essentially turns your display into a simple electronic billboard, showing a stream of posts (text and images) as they come through the service. Last year I fed this directly to my television and let my family watch people comment and post photos throughout the day, watching shot after shot of people's meals, kitchens, families, turkey failures, plate mishaps, and comments about naps. It seemed a little voyeuristic, but it was also a great way to experience Thanksgiving across the country as a whole, feeling some sort of connection with people I've never met. We watched meals ebb and flow with the timezones, people try to juggle more than one stop, and many missives about things for which people were thankful (and more than a few for what they were not thankful). It became quite an interactive affair in my house as everyone commented on their favorite images or updates and as they pressed me to post photos of our meal.


Some guests before dinner.


I finally get to eat.

This year I am not hosting, but I am bringing my laptop so, just in case anyone remembers, we can fire it up and watch this little slice of Americana play itself out throughout the day.

If you want to try this yourself, I have some configuration suggestions. First of all, you don't have to have a Brightkite account to use the wall. I recommend creating a Universe Stream instead of limiting it to one geographic area, one person, or one search term. Make sure you disable check-ins. You don't need to see that some guy named Ed checked in at 1313 Mockingbird Lane, you just want to see the comments and photos. If you want to see Twitter posts, you can enter Twitter search terms. These will help filter the tweets that get folded into the stream. You may need to adjust settings when you do it — when I ran this last year I didn't worry about Twitter spam or vulgarities.


Click for a full-size view of the configuration screen.

This little experiment made it much easier to explain what social media is and how it works, and I think it made for a richer Thanksgiving. I do recommend a big enough display that people can see from across the room, since nobody who's overeaten on Thanksgiving wants to be crowded by others around a laptop.

Enjoy your Thanksgiving and I look forward to your Brightkite posts (and tweets).

Tuesday, November 24, 2009

4 Principles of Mobile UX Design

Boxes and Arrows has an article titled "Four Key Principles of Mobile User Experience Design" written by a former academic mobile UX (User eXperience) researcher. As the author transitioned to the private sector, he felt that discussions of mobile UX were too driven by gee-whiz factors and not by practical principles of mobile user experience. He offers these four principles as a result, which I am summarizing here.

1: There is an intimate relationship between a user and his/her mobile device.

The example the author cites is loaning your phone to someone on a hot, sticky day. Most of us are uncomfortable letting someone fiddle with our phones — partly because of personal data on the phone, and partly because we are so physically tied to our phones we don't want others to soil them.

2: Screen size implies a user's state. The user's state suggests his/her commitment to what is on the screen.

The author argues that the declining screen real estate between movie screens, TVs, computers and, ultimately, mobile phones corresponds to the commitment the user has to watch a movie. The real point I take from this is that it is far easier to abandon a non-functioning site when on a mobile device, when your attention is already probably minimal, than it would be if you were using a full computer, with the ability to adjust a bad experience through browser features and so on.

3: Mobile interfaces are truncated. Other interfaces are not.

Mobile phones themselves do not offer the full array of input options that a computer does. A small QWERTY keyboard (at best), a touch screen, and maybe some accelerometers are a big step up from a 12-key number pad, but they don't offer all the options that a desktop computer offers with a mouse, multiple document interface, accelerator/modifier keys and so on. Expecting users to casually enter as much data on the mobile device as they would on their desktop computer is a bad starting point. This has always made me wonder why the .mobi TLD was approved when it has one more character than .com.

4: Design for mobile platforms — the real ones.

The author reminds us that there are four components to mobile devices: Voice, messaging, internet, and applications. It's common for the industry to get caught up in manufacturer-specific features and forget the core of the platform.

Read Up!

There are some good comments on the article furthering the discussion of mobile as a platform beyond just web browsing.

Friday, November 20, 2009

YouTube Will Automatically Caption Your Video

Three years ago YouTube/Google added the ability for video authors to add captions to videos. Over time came support for multiple caption tracks, the expansion of search to consider text in captions, and even machine translation support for the captions (see my other post about machine translation risks).

Even with hundreds of thousands of captioned videos on YouTube, new videos are posted at the rate of 20 hours of video per minute. For many companies (and not-for-profits and government agencies), YouTube provides the most cost-effective and ubiquitous method to distribute video content to users. Many of these organizations (particularly not-for-profits and government agencies) are required by law (US and elsewhere) to provide captions for video, but don't have the experience or tools to do so. Users who are deaf are excluded from fully understanding this content as a result.

This is where the speech recognition features (ASR) of Google Voice come into play. This technology can parse the audio track of your videos and create captions automatically. Much like machine translation, the quality of these captions may not be the best, but it can at least provide enough information for a user who could not otherwise understand the video at all to glean some meaning and value.

In addition, Google is launching "automatic caption timing," essentially allowing authors to easily make captions using a text file. As the video creator, an author will be able to create a text file with all the words in the video and Google's speech recognition software will figure out where those words are spoken and take care of the timing. This technique can greatly increase the quality of captions on videos with very little effort (or cash outlay for tools) on the part of the video creator.
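
To make the difference concrete: a traditionally captioned video needs a timed caption file (shown here in the common SubRip style, with timings invented for illustration), while the auto-timing feature should only need the bare transcript lines with no timecodes at all.

1
00:00:01,000 --> 00:00:04,500
Hi, and welcome to our annual report overview.

2
00:00:04,600 --> 00:00:09,200
This year we added captions to every video we published.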

You can read more at the YouTube Help Center article. You can also read the blog post announcing this feature at the Google Blog. The video below shows a short demo about the auto-captioning and auto-timing features.

Update (August 25, 2010): Paul Bukhovko of FatCow was kind enough to translate this entry into Belorussian: YouTube аўтаматычна захоплівае сваё відэа

Wednesday, November 18, 2009

IE9 First Details

Microsoft revealed some first details of Internet Explorer 9 at the Microsoft Professional Developer's Conference, as reported by Mashable today. The browser has only been in development for three weeks, so there's still quite a lot of time before it gets to market. According to Mashable, Microsoft did have the following to say:

  • On HTML 5: Microsoft was coy about whether it would support all of the HTML 5 standards, the next generation of HTML. The company doesn’t seem willing to commit to the standard until it is set in stone, but “wants to be responsible” about supporting it.
  • On JavaScript: They admit that their previous browsers don’t match the speed of Firefox or Chrome. However, it appears that IE9 looks to narrow this gap. From some of the data they presented, it looks like they’re getting closer to matching the other browsers (though they don’t beat them).
  • On CSS Support: It looks like IE9 will finally get better CSS support, especially for rounded corners. It’s a disappointment though, when you consider the other browsers have supported these things for years.
  • On Hardware Acceleration: IE9 will utilize DirectX hardware acceleration to improve graphic and AJAX rendering. It will push more work towards the GPU. This actually looks pretty slick from first appearances.

While I can understand Microsoft's position that HTML5 is not set and therefore may not support everything in the barely-draft spec, some of the elements seem pretty well locked in with only minor syntax and rendering issues left to suss out. To that point, I hope Microsoft can at least work in that support. The CSS support is a whole different story. Given how long the CSS2 spec has been out there (since 1998), it would be nice if they'd commit to fully supporting it, even if they aren't yet sure about CSS3 support.
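
The rounded-corner case is a good illustration of what better CSS support would mean in practice; today the same effect requires vendor-prefixed properties in the browsers that already handle it (a generic sketch, not anything Microsoft has published):

.panel {
  -moz-border-radius: 8px;     /* Firefox */
  -webkit-border-radius: 8px;  /* Safari and Chrome */
  border-radius: 8px;          /* the CSS3 property IE9 would need to honor */
}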

As Internet Explorer's market share is slowly eroded by Firefox, Safari and Chrome (on a trend that, if projected as a simple linear graph, would see IE go away by 2021), Microsoft is motivated to increase the overall performance of its next browser. Unfortunately, given the slow pace at which IE version 8 is being adopted over older versions (still at 34.1% of all IE installations after release March 19, 2009, versus IE7 at 37.6% after release October 2006 and IE6 at 28.3% from way back in August 2001), it is quite likely that even after IE9 is released it may be years before developers can rely on its features on public-facing web sites.

Monday, November 9, 2009

Screen Reader User Survey Results

WebAIM is a non-profit organization within the Center for Persons with Disabilities at Utah State University that focuses on accessible web content and technologies. WebAIM conducted a survey of the preferences of screen reader users back in December 2008, gathering a lot of interesting data about how users utilize assistive technologies (you can see the results of that survey at the WebAIM site).

WebAIM conducted another survey in October to track preferences of screen reader users. They received 665 responses to the survey consisting of a mix of disabled (90%) and abled users (10%). It's not a truly scientific survey, but it provides some valuable insight into usage patterns and user expectations.

I've just posted an article, WebAIM Screen Reader User Survey Results outlining the results of the survey. A couple excerpts:

Mobile

Pie chart of mobile screen reader use.

Most surprising to me was that 53% of those with disabilities claim they use a screen reader on a mobile device. More proficient screen reader users were more likely to use a mobile screen reader. Developers who already struggle to build sites for mobile devices, or to build sites that are accessible, may find the combination a daunting challenge. The survey doesn't gather other information on mobile use, perhaps because they were surprised by its prevalence as well.

Finding Information

Users were asked how they go about finding information on a lengthy web page. 50.8% of users indicated they use the page headings to navigate (really bolstering the argument of using proper headings in your content). 22.9% use the "find" feature of the browser, 16.1% navigate the links on the page, and 10.1% read through the page (and are apparently far more patient than I).
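
That 50.8% figure is as good an argument as any for a sane heading outline; something as simple as the structure below is what those users are jumping through (my own generic example, not from the survey):

<!-- Screen reader users can jump from heading to heading, or pull up a list of them. -->
<h1>Widgets International</h1>
<h2>Products</h2>
<h3>Widget Classic</h3>
<h3>Widget Pro</h3>
<h2>Support</h2>
<h3>Documentation</h3>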

Go read the rest of the article. Now. Go.

Thursday, November 5, 2009

Google Dashboard: What Google Knows about You

Google announced a new service/feature today, Google Dashboard. Given all the services Google offers and all the ways you can interact with Google, it's not surprising many people have privacy concerns and conspiracy theories (do enough people watch The Simpsons for me to make an MLB joke here?). Google announced it in a blog post today (Transparency, choice and control — now complete with a Dashboard!) and included a handy video to walk users through the process of accessing Google Dashboard. Dashboard essentially offers a one-stop view of all the data in your Google account across 20 of its services, including Gmail, Calendar, Docs, Web History, Orkut, YouTube, Picasa, Talk, Reader, Alerts, and Latitude. If you are interested in privacy policies for each of Google's services, head over to their Privacy Center.

Friday, October 30, 2009

Internet Turns 40, Just Might Catch On

Media outlets seem to have settled on October 29 as the official birthday of the Internet. This date has been chosen because it's the day that Leonard Kleinrock at the University of California-Los Angeles sent a message over a two-computer network (the other end being a computer at Stanford Research Institute) with Charley Kline manning the UCLA keyboard and Bill Duvall on the Stanford site. It's worth noting that the computer carrying the first ever transmission on the Internet ("LOGIN") crashed after only two letters ("LO"). I believe that Kline actually typed an "L" for the third letter (instead of "G") and in a fit of future-sensing self-sacrifice, executed a core dump all over the floor.

Some may point out that on September 2, 1969, two computers were connected with a 15-foot cable and passed data back and forth. That was a precursor to the networking that happened a month later, but is not generally regarded as the birth of the Internet. Just as neither the first email message (1971) nor the first web browser (1993) are considered the birth of the Internet.

Given this historic day, there has been a lot of media coverage (some of it pretty bad, just like the average YouTube video) detailing some of the steps or milestones of the last 40 years. Some of the crunchy bits:

The opening image of this post is an Internet timeline (in extra large format so you can read it from the other room, or across the street) from the Daily News LA article "How the Internet was born at UCLA."

Videos and Audio Bits

The All Things Considered broadcast:

A somewhat technical perspective of the time leading up to and after the birth of the Internet:

A video from 1993 by the CBC covering the "growing phenomenon of Internet" (covering mostly just Usenet):

The Web

For those who don't quite understand the relationship between the web and the Internet as a whole, the World Wide Web came much later. First as a proposal to CERN by Tim Berners-Lee in March of 1989 and then in the form of NCSA Mosaic in April of 1993 (yes, it was not the first web browser, but it was the first to get traction).

To qualify that a bit more, if anyone comes to you claiming 25 years of web experience (as one follower on Twitter recently did), you can send them away. The web is barely old enough to drive.

Update: There are no Al Gore jokes in here. This was intentional. Srsly. Then I'd have to link to a photo of Al Gore and nobody wants that.

Thursday, October 29, 2009

Reminder: See Me Speak, Tues. Nov. 3

I will be one of the panelists at the Infotech Niagara panel session titled "Maximizing Your Web Presence." It takes place Tuesday, November 3, 2009 at 8:00am at Buffalo Niagara Partnership's conference room, 665 Main Street, Suite 200, Buffalo, New York 14203 (map below). BNP has parking information at their web site.

From Infotech Niagara:

Ok, you have a website, now what?

Join infoTech Niagara for a panel discussion on "Maximizing Your Web Presence." Our panelists bring years of experience in web strategy, web design, search engine optimization, social media, web video and more.

Come learn from the experts what you can do to leverage new and existing technologies to maximize the effectiveness of your web presence.

Panelists include:

  • Adrian Roselli, Senior Usability Engineer, Algonquin Studios
  • Brett Burnsworth, President, Zoodle Marketing
  • Jason Holler, President, Holler Media Productions
  • Mike Brennan, Vice President, Noobis, Inc.

Continental Breakfast will be provided.

Cost:
ITN Members: $10
Non-Members: $15
Students: $5

Register online.

View Larger Map

Wednesday, October 28, 2009

Google CEO Describes Web in 5 Years

ReadWriteWeb posted an article (Google's Eric Schmidt on What the Web Will Look Like in 5 Years) highlighting some bits from Eric Schmidt's (Google CEO) talk at the Gartner Symposium/ITXpo Orlando 2009. ReadWriteWeb was even kind enough to post a 6 minute excerpt that they believe would be of interest to those of us on the web:

He has some specific ideas that stand out:

  • Citing Moore's Law and claiming that five years means a factor-of-ten increase in computing power, he sees far more powerful computers and broadband in excess of 100 Mbps. This feeds his idea that TV, radio, the web (YouTube, etc.) will all become merged in some way.
  • The internet will be dominated by content in Chinese (although he's silent on the future of the Great Firewall of China).
  • Real-time content and real-time search may further push traditional news sources to a tier below user-generated content.
  • Watch how today's teenagers use the web, as they will drive how employees consume content in 5 years.

Catch the full 45 minute interview on YouTube: Eric Schmidt, CEO of Google, interviewed by Gartner analysts Whit Andrews and Hung LeHong in front of 5,000 CIOs and IT Directors at Gartner Symposium/ITxpo Orlando 2009.

Tuesday, October 27, 2009

New Google Analytics Features

In the article "Google Analytics Now More Powerful, Flexible and Intelligent" from last Tuesday (yes, I know I'm behind on this) on the Google Analytics Blog, the Analytics team has introduced some interesting new features. Some of the updates:

  • Two new goal types allow you to set thresholds for Time on Site and Pages per Visit. You can also define up to 20 goals per profile.
  • A server-side chunk of code is coming to allow tracking on mobile sites (since mobile browsers may not be able to run the JavaScript Analytics code on a page).
  • Advanced table filtering allows you to filter rows based on assorted conditions. In their sample video they show how to filter keywords to identify just the keywords with a bounce rate less than 30% that referred at least 25 visits (it's much easier to understand with this video).
  • You can now select unique visitors as a metric for custom reports.
  • You can now have multiple custom variables so you can track visitors according to visitor attributes, session attributes and page attributes (see the sketch after this list).
  • You can now share custom reports and segments with other Analytics users.
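
For the custom variables item above, the classic ga.js tracker exposes a _setCustomVar() call; a hedged example follows (the names and slot choices are mine, so check the current Analytics documentation before leaning on it):

// Assumes the standard ga.js snippet has already created pageTracker.
// _setCustomVar(slot, name, value, scope) where scope 1 = visitor, 2 = session, 3 = page.
pageTracker._setCustomVar(1, 'Member Type', 'Subscriber', 1);  // visitor-level attribute
pageTracker._setCustomVar(2, 'Logged In', 'Yes', 2);           // session-level attribute
pageTracker._trackPageview();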

There are two items over which the blog post is most excited:

Analytics Intelligence: We're launching the initial phase of an algorithmic driven Intelligence engine to Google Analytics. Analytics Intelligence will provide automatic alerts of significant changes in the data patterns of your site metrics and dimensions over daily, weekly and monthly periods. For instance, Intelligence could call out a 300% surge in visits from YouTube referrals last Tuesday or let you know bounce rates of visitors from Virginia dropped by 70% two weeks ago. Instead of you having to monitor reports and comb through data, Analytics Intelligence alerts you to the most significant information to pay attention to, saving you time and surfacing traffic insights that could affect your business. Now, you can spend your time actually taking action, instead of trying to figure out what needs to be done.
Custom Alerts make it possible for you to tell Google Analytics what to watch for. You can set daily, weekly, and monthly triggers on different dimensions & metrics, and be notified by email or right in the user interface when the changes actually occur. [....]

Monday, October 26, 2009

R.I.P. Geocities

In my post "Wait - GeoCities Still Exists?" I mentioned that on October 26 Geocities was going away. Well, that sad day is upon us. And if you didn't follow those Geocities links I posted, you are SOL now. However, more tributes have popped up on the web in honor of this historic (hysterical?) day.

Friday, October 23, 2009

Usability Testing vs. Expert Reviews

An article at UX Matters titled "Usability Testing Versus Expert Reviews" takes a reader question and tosses it to a series of experts to answer:

Under what circumstances is it more appropriate to do usability testing versus an expert review? What are the benefits and weaknesses of each method that make one or the other more appropriate in different situations?

The experts ultimately all came up with similar answers — do both. Start with the expert review to take care of low-hanging fruit and then bring users in for the testing phase to catch the issues that trip them up. Some quotes:

Expert reviews are especially useful for finding violations of usability standards and best practices. These are often obvious problems that may or may not cause problems during usability testing.
Before doing usability testing, it is helpful to do at least an informal expert review to determine what to focus on during testing.
I recommend always doing both. Expert reviews that are performed by specialists, using standards and heuristics, reveal easy-to-catch usability problems in a very cost-efficient way.
While usability testing is more powerful than expert review, both methods in combination are great, because you first want to discover the low-hanging fruit and get them out of the way.
We need to remember that expert review is a user-free method. Regardless of the evaluators' skill and experience, they remain surrogate users—expert evaluators who emulate users—and not typical users. The results of expert review are not actual, primary user data and should lead to—not replace—user research.

There aren't any real surprises here, but it's interesting to see the different approaches suggested by each expert.

Thursday, October 22, 2009

Bing and Google Add Social Search

Google and Bing have been locked in a struggle recently for search engine dominance. Bing came out of the gates fast and gained a lot of market share, but has appeared to level off recently (another link, and another link). Neither of them wants to lose any ground. Factor in the recent explosion of Twitter and other near-real-time social media outlets on the web and people's desire to search them all, and you have two search giants salivating over new opportunity.

Both Microsoft and Google announced partnerships with Twitter yesterday. There had been fears of one making an investment in Twitter and locking the other out from search results, but those fears appear to have been assuaged. At least for now. Consider that Microsoft already has a sizable cash investment in Facebook ($240 million, or 1.6% of Facebook's valuation at the time) giving them a leg up over Google on searching within Facebook. It seemed that the same thing might happen with Twitter (a company that just closed a deal for another $100 million in funding, pushing its value to ~$1 billion).

Google made its announcement at the Web 2.0 Expo, showing off its Social Search feature from Google Labs. The same presenter, Google's VP of Search Products and User Experience, then posted this to the Google blog:

...[W]e are very excited to announce that we have reached an agreement with Twitter to include their updates in our search results. We believe that our search results and user experience will greatly benefit from the inclusion of this up-to-the-minute data, and we look forward to having a product that showcases how tweets can make search better in the coming months.

Mashable has some more detail on what was discussed at the in-person announcement, including some information on linking your social media profiles into Google to allow searching across them as well.

Microsoft signed two deals on this same day, one with Twitter to bring its tweets to Bing, and one with Facebook to bring status updates to the search engine (another link). The Bing blog reports its Twitter search:

...[T]oday at Web 2.0 we announced that working with those clever birds over at Twitter, we now have access to the entire public Twitter feed and have a beta of Bing Twitter search for you to play with (in the US, for now).

Essentially the Twitter search has been rebuilt within Bing, including the real-time updates. Bing also includes relevancy, based on things like number of retweets, keywords and the quality of the tweets (who decides this?). Bing also shows the trending topics as a tag cloud. In case you haven't tried it yet, head on over to Bing.com/twitter.

The next few months should be interesting as both Bing and Google tweak and enhance their new offerings. Given their same-day announcements, we can expect both teams to be watching the other closely and responding to changes quickly.

In the meantime I'll be scratching my head on why Twitter, a place solely for me and millions of others to spout our own brand of crazy in real-time, is valued at $1 billion.

UPDATE: Read this swell article, just posted: Social Search: Customers Influence Search Results Over Brands.

Wednesday, October 21, 2009

Firefox 3.6 to Support Web Open Font Format

Mozilla's developer blog today posted that they have added support for the Web Open Font Format (WOFF) to Firefox 3.6. Firefox 3.5 gave us support for linking to TrueType and OpenType fonts, but this takes it a step further to support a format that is more robust for two key reasons:

  1. WOFF is a compressed format, resulting in files smaller than an equivalent TrueType or OpenType font.
  2. It avoids DRM and domain labels by containing meta data about its source, garnering support from typeface designers and foundries.

If you want to see this in action, you'll need to grab a Firefox 3.6 nightly build until the full release is out the door. If you should feel compelled to do that, the nice folks at Mozilla have provided a sample block of CSS that uses the @font-face rule to link to a WOFF font:

@font-face {
  font-family: GentiumTest;
  src: url(fonts/GenR102.woff) format("woff"),
       url(fonts/GenR102.ttf) format("truetype");
}

body {
  font-family: GentiumTest, Times, Times New Roman, serif;
}
Structured this way, browsers that support the WOFF format will download the WOFF file. Other browsers that support @font-face but don’t yet support the WOFF format will use the TrueType version. (Note: IE support is a bit trickier, as discussed below). As WOFF is adopted more widely the need to include links to multiple font formats will diminish.

If you are fortunate enough to have a build of Firefox 3.6 already up and running on your machine, go to a test page using ff Meta set up by the nice folks at edenspiekermann (if part of the name is familiar, Erik Spiekermann is the designer of the Meta family, and if the typeface is familiar, it's what we use at Algonquin Studios). The image below shows how that page looks in Firefox 3.6 using ff Meta (left side) and how it looks rendered in Internet Explorer 8 (right side).

Screen shot showing page with ff Meta typeface on one half, not on the other.

Because IE8 only supports the EOT format, the blog offers some code to account for it in the CSS. Since IE8 doesn't understand the format hints, it will parse the hints as part of the URL, resulting in requests to the server for files that don't exist. The end user will see things just fine because of the EOT reference, but your logs will show some odd 404s as a result of this technique. The Mozilla post has more details on this and some other issues. The code to do this:

@font-face {
  font-family: GentiumTest;
  src: url(fonts/GenR102.eot);  /* for IE */
}
 
@font-face {
  font-family: GentiumTest;
  /* Works only in WOFF-enabled browsers */
  src: url(fonts/GenR102.woff) format("woff"); 
}

The main Mozilla blog has a post today listing the supporting organizations with the following endorsement:

We endorse the WOFF specification, with default same-origin loading restrictions, as a Web font format, and expect to license fonts for Web use in this format.

Updates

Tuesday, October 20, 2009

"Myth of Usability Testing" at ALA

There is a very good article over at A List Apart today titled "The Myth of Usability Testing." The article starts off with an example of how multiple usability evaluation teams, given the same task and allowed to run at it as they saw fit, had far less overlap in found issues than one would hope.

The author goes on to explain why usability evaluation is unreliable with a series of examples (which seem painfully obvious, and yet these mistakes keep happening) broken into two main categories:

  • "Right questions, wrong people, and vice versa."
    Using existing users to evaluate a site is a loaded approach; they already have expectations set by the current site that taint their ability to see other options. Conversely, asking new users to complete tasks driven by an existing design is not a good way to evaluate new approaches.
  • "Testing and evaluation is useless without context."
    It is common for me to hear the statement that "nothing can be more than two clicks from the home page," but this ignores the real context of the site and its users. These blanket statements or goals can harm an evaluation when instead a test should start with an understanding of user goals, success metrics, and real site goals.

From here the article outlines what usability testing is actually good for, and then helps focus the reader on the reality of the testing process and its results. I'm glossing over the other 2/3 of the article partly because I wanted to draw attention to the bits above and partly because you should just go read it already. There are some good links in the article for tools that can help identify trouble spots and support an evaluation.

Sunday, October 18, 2009

Current CSS3, HTML5 Support

The Tool

Last week saw the launch of FindMeByIp.com, a very handy web site that displays a user's current IP address (along with a geographic breakdown to city, if possible), user agent string (browser, version and operating system) and support for CSS3 and HTML5 (read the article about it). It accomplishes this last bit by using the Modernizr JavaScript library to test a series of features in your browser:

  • @font-face
  • Canvas
  • Canvas Text
  • HTML5 Audio
  • HTML5 Video
  • rgba()
  • hsla()
  • border-image
  • border-radius
  • box-shadow
  • Multiple backgrounds
  • opacity
  • CSS Animations
  • CSS Columns
  • CSS Gradients
  • CSS Reflections
  • CSS 2D Transforms
  • CSS 3D Transforms
  • CSS Transitions
  • Geolocation API

If you are running the Google Chrome for Internet Explorer plug-in, it will show up in your browser user agent string on this page. However, it will also report your Internet Explorer browser as Chrome.

The Results

The site developers ran their own browsers through this tool and that collection of information has been gathered up and posted to provide information on current browser support for the latest standards (or recommendations). Deep Blue Sky, one of the developers of this site, has written up the level of support in all the A-Grade browsers. Yahoo's model outlines A-Grade browsers as "... identified, capable, modern and common" and claims "approximately 96% of our audience enjoys an A-grade experience." This is their codified way of simply saying, "the latest common browsers." If you read my post, Browser Performance Chart, then you'll see the same browsers. The article, Browser support for CSS3 and HTML5, covers the following browsers (the author didn't test against anything other than Windows versions):

  • Safari 4 (Win)
  • Firefox 3.5 (Win)
  • Google Chrome (Win)
  • Opera 10 (Win)
  • Internet Explorer 6, 7 & 8

The surprise in here is that Safari 4 outshone (outshined? outshinerified?) the others with Google Chrome coming up close behind. Firefox 3.5 was missing some key CSS3 support and Opera 10 was surprisingly lacking in support. As expected, Internet Explorer brings up the rear, lacking in everything but @font-face support (even in IE8) which has actually been there since version 6 (but only for the .eot format).
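
As a concrete example of the kind of feature separating the leaders from the laggards, multiple backgrounds are a one-declaration affair where supported, as long as you feed older browsers a single-image fallback first (a generic sketch, not taken from the article):

.banner {
  /* Single-image fallback for browsers that don't understand multiple backgrounds */
  background: url(hero.jpg) no-repeat bottom;
  /* Overrides the fallback only where the multiple-background syntax is understood */
  background: url(texture.png) repeat-x, url(hero.jpg) no-repeat bottom;
}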

Saturday, October 17, 2009

Personas in Comic Format

For developers, and clients, struggling with the concept of personas, there is a very easy to read primer in the form of a comic over at the ThinkVitamin blog in an article titled "How to Understand Your Users with Personas."

The concept of personas was first introduced in the book The Inmates Are Running the Asylum (1998) as a tool for interaction design. In short (very short), personas are intended to help you (or your client) understand the needs of your users — their goals, experience, objectives, perspectives, etc. If I tell you too much, the comic won't really be worth reading, so go read it already.


It's like they're in my head.

Friday, October 16, 2009

Browser Performance Chart

Jacob Gube has posted a handy chart over at Six Revisions titled "Performance Comparison of Major Web Browsers." He tests the current versions of five browsers:

  • Mozilla Firefox 3.5
  • Google Chrome 3.0
  • Microsoft Internet Explorer 8.0
  • Opera 10.0
  • Apple Safari 4.0

In his tests he used the following performance indicators, tested three times each with an unprimed cache, and averaged the results:

  • JavaScript speed
  • average CPU usage under stress
  • DOM selection
  • CSS rendering speed
  • page load time
  • browser cache performance

When Google Chrome 3 was released in September, Google claimed a 150% increase in JavaScript performance from its previous version but didn't offer comparisons to other browsers. Opera also made improvement claims in its September release of Opera version 10, specifically a "40% faster engine." Those claims are also not made in comparison to other browsers. The overall performance ranking he assigns to the browsers isn't too much of a surprise, with Google Chrome as the best and Internet Explorer 8 as the worst. If he ran as many persistent tabs in Firefox as I do, it might not hold on to the #2 slot so easily. The ranking:

  1. Google Chrome 3.0
  2. Mozilla Firefox 3.5 (tied with Safari)
  3. Apple Safari 4.0 (tied with Firefox)
  4. Opera 10.0
  5. Microsoft Internet Explorer 8.0

Use the chart below to link to the full size version. You can also download the raw data in CSV format from the site in case you want to try your own hand at colorful charts.

Small view of browser performance comparison chart.

Thursday, October 15, 2009

Developer Discusses Dyslexia and Dyscalculia

Sabrina Dent, a web designer hailing from Ireland, has blogged about her struggle with dyslexia and dyscalculia, and how today's web applications affect it, in the post "Dyslexia, Dyscalculia and Design". For some context, she links to the Wikipedia article on dyscalculia and highlights the bits that apply to her:

  • An inability to read a sequence of numbers, or transposing them when repeated, such as turning 56 into 65.
  • Problems with differentiating between left and right.
  • Difficulty with everyday tasks like checking change and reading analogue clocks.

Sabrina discusses her experience with the examples of login screens (specifically the steps that require more detailed information than a username and password), phone numbers, and booking calendars.

It's a brief post, but it's insightful. As a web designer she understands the motivation for these types of interfaces, but that doesn't mean they are easy for her to use.

Wednesday, October 14, 2009

Derek Powazek on SEO as Snake Oil

There are many on the web who will recognize the name Derek Powazek. He is the name behind old-school sites such as Fray.com and Kvetch.com (which has apparently been taken over by spam bots) and wrote a book about communities (Design for Community, which mentions me by name, which is awesome). I also had the pleasure to meet him at SXSW back in 2001 and even participate in his Fray Cafe. So when I saw his blog post on SEO that started off with this statement, I was pleased:

Search Engine Optimization is not a legitimate form of marketing. It should not be undertaken by people with brains or souls. If someone charges you for SEO, you have been conned.

What pleases me more is that it echoes a comment I made in my post Verified: Google Ignores Meta Keywords back in September:

Those of us trying to protect our clients from SEO/SEM snake-oil salesmen are happy to finally have an official statement from Google.

Now that I've tooted my horn and compared myself to someone considered one of the top 40 "industry influencers" of 2007 by Folio Magazine, let me get to my point. I've been working on the web since HotJava was still a browser. I was excited when the first beta of Netscape Navigator made its way into the world, when Yahoo was a couple of guys in a dorm posting links, and when my Jolt Cola web site was included in their index because I asked them to include it; since then the way people find things on the web has changed dramatically. For the last decade or so the search engine has become more than a convenience; it's a necessary feature of the web, without which we'd be stuck wandering billions of terrible pages of things we don't want to see (many thousands fewer of those pages once GeoCities finally closes down). Because of this, getting your site into the search engine's top spot has become the holy grail of online marketing, one that far too many people are happy to exploit as an opportunity.

Derek makes two key points in his article:

  1. The good advice is obvious, the rest doesn’t work.
  2. SEO is poisoning the web.

He supports the first point by noting that formatting, structure, summaries, quality links and so on have worked since the beginning and will continue to work. There's no magic there. It's free to read anywhere on the web. For his second point he references all the Google bombing tactics that are employed by bots to spam blogs, comment areas, twitter accounts, parked domains, etc. as well as questionable tactics that exploit loopholes (albeit temporary ones) in a search engine's ranking algorithm.

As of now the article has 180 comments, many of which are from optimizers who take umbrage with the blanket statement that SEO is the work of the soulless foulspawn and their dark arts (my words, but I think I summarize his sentiment well enough). After receiving so many comments Derek added a post yesterday, his SEO FAQ, responding to a generalization of many of the questions and comments. He also offers some suggestions, including this one targeted at clients (I just took the first part):

If someone approaches you about optimizing your search engine placement, they're running a scam. Ignore them.

Having said something similar to clients in the past, this is normally where I'd segue into a discussion about how I've worked hard to ensure Algonquin Studios' content management system, QuantumCMS, adheres to best practices and provides many ways to get quality content into the pages, links, titles, page addresses, meta information (after I tell them Google doesn't use meta data for ranking but they insist because they've been conditioned to think that way), and so on. This is also the part where I remind clients that their help is needed to write that copy; to interact with users, customers, partners, and industry organizations to generate quality relationships and references (often in the form of links); and to plan on spending time working on this regularly to keep it fresh and relevant.

I look forward to the time when I won't be spending chunks of my day clearing spambots from my QuantumCMS Community forum, batting down spam email about submissions to 300+ search engines, ignoring bit.lys in unsolicited Twitter @ responses, and generally fighting the aftereffects of all the black hat SEO enabling we've allowed for years.

Tuesday, October 13, 2009

Come See Me: November 3

I will be one of the panelists at the Infotech Niagara panel session titled "Maximizing Your Web Presence." It takes place Tuesday, November 3, 2009 at 8:00am at Buffalo Niagara Partnership's conference room, 665 Main Street, Suite 200, Buffalo, New York 14203 (map below). BNP has parking information at their web site.

From Infotech Niagara:

Ok, you have a website, now what?

Join infoTech Niagara for a panel discussion on "Maximizing Your Web Presence." Our panelists bring years of experience in web strategy, web design, search engine optimization, social media, web video and more.

Come learn from the experts what you can do to leverage new and existing technologies to maximize the effectiveness of your web presence.

Panelists include:

  • Adrian Roselli, Senior Usability Engineer, Algonquin Studios
  • Brett Burnsworth, President, Zoodle Marketing
  • Jason Holler, President, Holler Media Productions
  • Mike Brennan, Vice President, Noobis, Inc.

Continental Breakfast will be provided.

Cost:
ITN Members: $10
Non-Members: $15
Students: $5

Register online.

Embedded map of 665 Main Street, Buffalo, New York.

Friday, October 9, 2009

Wait - GeoCities Still Exists?

October 26, 2009 marks the end of an era. GeoCities, the much-maligned free hosting service offered since way back in the early days of the web, is closing down. Like Abe Vigoda, it's something many people assumed had long since fizzled out.

Dating back to 1995, GeoCities provided users with free web hosting accounts using their "neighborhoods" model and was getting over six million monthly page views. In 1997, GeoCities introduced the now infamous ads on its pages and still became the 5th most popular site on the web. At its peak in 1999, when it held the 3rd spot on the web (behind AOL and Yahoo), Yahoo paid $3.65 billion ($3.57 billion according to Wikipedia) for GeoCities. That's in billions. Some of us may remember the furor that took place when Yahoo laid claim to all rights over content (copy, images, etc.), a claim Yahoo eventually reversed. And then there was the dot-com bust. Over time GeoCities became synonymous with terrible, tacky web pages made up of animated graphics, under construction icons, background music, and browser download badges, among other horrors of the bygone days of the web.

Given Yahoo's recent cost-cutting measures it makes sense that the free service would be one of their offerings to go. Yahoo announced the closure on April 23 of this year, but yesterday Yahoo sent out a final notice to users explaining that they can move their sites to Yahoo's $4.99/month hosting service, and also warning them that on October 26 it is flat out deleting GeoCities. All of it. With no chance of recovery. Period. So there.

Sadly, I don't know what I'll do without some of these GeoCities standbys, which haven't been updated in nearly a decade. That's 10 years. I carry on conversations with IRC bots younger than that.

At least we still have Angelfire to deride...

Thursday, October 8, 2009

October 6 Panel Follow-up

For those of you who attended the Business First Power Breakfast: Online Networks this past Tuesday, October 6 and have reached out to me to offer feedback and ask questions, thanks for your interest and thanks for coming. I was thrilled to see a room full of keenly interested attendees.

Based on some of the questions I have received since then I'd like to make a few comments.

First, the Social Media Revolution video I posted Tuesday afternoon is not mine. I had nothing to do with it; it was playing when I got there. It's a great overview of the growth of social media and other online technologies, but it isn't mine. I just embedded it so you could view it here in case you forgot the title.

Second, some of you realized, in retrospect, that the guy you saw taking photos of the food with his cell phone camera was me. I am surprised how many people caught on to my comments about food photos and I know some of you went looking for the photos on my Twitter feed. Unfortunately, because I use Brightkite for my photos (which posts to Twitter) and because Brightkite went down that morning for a massive upgrade (and still hasn't come back up), those photos never made it out. So I've dropped smaller versions of them below.

Tuesday, October 6, 2009

Social Media Revolution Video

If you attended the Business First Power Breakfast: Online Networks event this morning where I was a panelist, then you saw the Social Media Revolution video at the beginning and end of the panel. In case you didn't catch the address or title, here's the video.

If you were at the event this morning and came here as a result of hearing me speak, thanks for coming. If you weren't at the event this morning then you had better get me in to speak to your organization already.

Monday, October 5, 2009

List of URL Shorteners Grows Shortener

One of the many URL shorteners has announced that it is shutting down in just under three weeks. Cli.gs has announced, via its blog, that it will no longer accept new URLs to be shortened as of October 25. It will also stop logging analytics. Cli.gs will still forward URLs at least through November, but no guarantees have been made beyond that. Since Cli.gs has been a one-man show serving tens of millions of forwards a month, with the site owner funding it from his own pocket, it's not too much of a surprise.

Back in early August another URL shortener, Tr.im, called it quits. The argument Tr.im posted on its page was essentially that there is no way to monetize URL shortening, and without money to be made it's hard to justify further development. Tr.im committed to supporting its Tr.im links through December 31, 2009. However, shortly after that post, Tr.im claimed to be resurrected. Since that announcement, Tr.im has started its progression into an open-source, community-owned endeavor. Its success is now at the mercy of how much time developers are willing to donate.

Both URL shorteners have, either directly or indirectly, referenced the lack of revenue from their business model, the number of competitors, and the fact that Twitter has standardized on Bit.ly, making it the de facto leader in the Twitterverse, and by extension, much of social media.

URL shorteners work by taking any URL that a user provides and returning a much shorter address that can be pasted into a tweet (with its famous 140-character limit), an email message, or any other place where a terribly long link might be cumbersome. Some of them just redirect users right to the target page while some, like the not-so-short Linky.com.au, load the target page into a frame with their brand (and statistics and other link information) sitting at the top of the viewport. In the future, this brand area could be used to serve ads, possibly generating revenue.
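
The redirect half of that process is simple enough to fit in a few lines. Below is a minimal sketch, not any particular service's code, written as a tiny Node.js server; the short code and target address are made up, and a real service would keep its lookup table in a database.

    // Minimal sketch of the redirect half of a URL shortener (Node.js).
    var http = require('http');

    // Hypothetical lookup table; a real service would use a database.
    var urls = {
      'abc123': 'http://example.com/some/terribly/long/address?with=parameters'
    };

    http.createServer(function (req, res) {
      var code = req.url.slice(1); // strip the leading "/"
      var target = urls[code];
      if (target) {
        // Send the visitor on to the real address.
        res.writeHead(301, { 'Location': target });
      } else {
        res.writeHead(404, { 'Content-Type': 'text/plain' });
        res.write('Unknown short code');
      }
      res.end();
    }).listen(8080);

Everything hinges on that lookup table staying online, which is worth keeping in mind for the link rot discussion below.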

The major complaint with URL shorteners is that users cannot always see where the link will take them, which can be a boon for all those fake Twitter accounts trying to link users to porn (sorry, pr0n). This has led to browser add-ons and Twitter applications that can expand the URL as a tooltip before you click and are surprised by a screen (and speakers) full of NSFW content.
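
Those add-ons don't need anything exotic to do that: requesting the short URL and reading the Location header from the redirect response is enough to reveal the destination without visiting it. A minimal Node.js sketch is below; the bit.ly short code is made up for illustration.

    // Minimal sketch: reveal where a short URL points without following it.
    var http = require('http');

    // Hypothetical short code used only for illustration.
    http.get({ host: 'bit.ly', path: '/abc123' }, function (res) {
      if (res.statusCode >= 300 && res.statusCode < 400 && res.headers.location) {
        console.log('Expands to: ' + res.headers.location);
      } else {
        console.log('No redirect found; status ' + res.statusCode);
      }
    });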

My additional complaint is that URL shorteners, and specifically their demise, lead to link rot on the web. If any of these services were to shut its doors tomorrow and stop all redirections, the links using that service would all become dead ends. In the case of tweets, we're talking about millions and millions of tweets that will become nonsense (more so than many of them are now). For emails, articles, or anything else that relies on these URL shorteners, they also become orphaned references without context. If the Mayans had it right, that could be the End of Days we're all expecting in 2012.

Even if you don't believe in the great Linkpocalypse, you should take a few moments to read up on link rot at Wikipedia (an article which is sadly devoid of any mention of URL shorteners).