
Wednesday, April 22, 2015

On the Mis-Named Mobilegeddon

If you are a web pro then it is likely that you heard that Google's search results were going to change based on how mobile-friendly a site is (you probably heard a couple months ago even). This change took effect yesterday.

As with almost all things in the tech world that affect clients, the press hit yesterday as well, and today clients are looking for more information. Conveniently, our clients are golden as we went all-responsive years ago.

If you already built sites to be responsive, ideally mobile-first, then you needn't worry. Your clients have probably already noticed that the text "mobile-friendly" appears in front of the results for their sites in Google and have been comforted as a result.

If you have not built sites to be responsive, or have had no mobile strategy whatsoever, then you may be among those calling it, or seeing it referred to as, mobilegeddon. A terrible name that clearly comes from FUD (Fear, Uncertainty and Doubt).

If you are someone who relies on a firm to build and/or manage your site, then you should also beware the SEO snake oil salesman who may knock on your door and build on that very FUD to sell you things you don't need.

From Google Webmaster Central

For the latter two cases, I have pulled the first three points from Google's notes on the mobile-friendly (a much better term) update. I recommend reading the whole thing, of course.

1. Will desktop and/or tablet ranking also be affected by this change?

No, this update has no effect on searches from tablets or desktops. It affects searches from mobile devices across all languages and locations.

2. Is it a page-level or site-level mobile ranking boost?

It’s a page-level change. For instance, if ten of your site’s pages are mobile-friendly, but the rest of your pages aren’t, only the ten mobile-friendly pages can be positively impacted.

3. How do I know if Google thinks a page on my site is mobile-friendly?

Individual pages can be tested for “mobile-friendliness” using the Mobile-Friendly Test.
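
In my experience the most common reason a page fails that test is a missing viewport declaration. As a hedged sketch (this alone doesn't make a page mobile-friendly, but without it the test will almost certainly fail), the one-liner looks like this:

  <meta name="viewport" content="width=device-width, initial-scale=1">

Your CSS still has to cooperate, which is where the next section comes in.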

From Aaron Gustafson

Aaron Gustafson put together a simple list of four things you as a web developer can do to mitigate the effects of Google's changes, though the simplicity belies the depth of effort that may be needed for some sites. I've collected the list, but his post has the details for how to approach each step:

  1. Embrace mobile-first CSS (see the sketch after this list)
  2. Focus on key tasks
  3. Get smarter about images
  4. Embrace the continuum
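
To illustrate that first item, here is a minimal sketch of mobile-first CSS as it might appear in a page's style block: the base rules serve the small screen, and a min-width media query layers on the wider layout. The class name and breakpoint here are hypothetical, not from Aaron's post.

  <style>
    /* Base styles: the small screen gets these by default */
    .content { padding: 1em; }

    /* Wider screens layer on more layout via a media query */
    @media (min-width: 48em) {
      .content { max-width: 60em; margin: 0 auto; }
    }
  </style>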

What Is Your Mobile Traffic?

I've been asked how to find out how much of a site's traffic comes from mobile users. In Google Analytics this is pretty easy:

  1. Choose Audience from the left menu.
  2. Choose Mobile once Audience has expanded.
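
If you would rather pull the number programmatically, the Core Reporting API exposes a device dimension. This query is only a sketch: it assumes you have already authorized API access, and the profile ID and dates are placeholders.

  https://www.googleapis.com/analytics/v3/data/ga
      ?ids=ga:12345678
      &start-date=2015-03-22&end-date=2015-04-21
      &metrics=ga:sessions
      &dimensions=ga:deviceCategory

The ga:deviceCategory dimension buckets sessions into desktop, mobile, and tablet.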

Bear in mind that this just tells you where you are today. If that number drops then it may be a sign that your mobile strategy isn't working. At the same time, if that number is already low then it may not drop any further owing to unintentional selection bias in how your pages are coded.

Oh, By the Way

Google isn't the only search engine. When I mentioned that on this blog before, Google had 66.4% of the U.S. search market. As of January 2015, that's down to 64.4%. Bing is up from 15.9% to 19.7%.

Google Sites led the U.S. explicit core search market in January with 64.4 percent market share, followed by Microsoft Sites with 19.7 percent and Yahoo Sites with 13.0 percent (up 1.0 percentage point). Ask Network accounted for 1.8 percent of explicit core searches, followed by AOL, Inc. with 1.1 percent.

While I Have Your Attention

Two days after the initial announcement of this change, word also came that Google is working on a method to rank pages not by inbound links, but by trustworthiness, in essence by facts.

When this finally hits, pay attention to those who refer to the change as Truthigeddon. Be wary of them.

Wednesday, June 12, 2013

Google Needs to Provide Android App Interstitial Alternative

Yesterday Matt Cutts from Google tweeted that Google search results for users on smartphones may be adjusted based on the kinds of errors a web site produces (of course I was excited).

Matt links to a page that outlines two examples of errors that might trigger this downgrade of a site's position in the Google search results and, right in the first paragraph, links to Google's own common mistakes page:

As part of our efforts to improve the mobile web, we published our recommendations and the most common configuration mistakes.

I think it's fair to assume that anything listed on the "Common mistakes in smartphone sites" page can negatively impact your site ranking. In particular this section on app download interstitials caught my eye:

Many webmasters promote their site's apps to their web visitors. There are many implementations to do this, some of which may cause indexing issues of smartphone-optimized content and others that may be too disruptive to the visitor's usage of the site.

Based on these various considerations, we recommend using a simple banner to promote your app inline with the page's content. This banner can be implemented using:

  • The native browser and operating system support such as Smart App Banners for Safari on iOS6.
  • An HTML image, similar to a typical small advert, that links to the correct app store for download.

I think it's good that Google links to the Apple article. I think it's unfortunate that Google does not link to Microsoft's own solution. If you read my blog regularly, or just follow me on Twitter, you may know that I covered both Apple's and Microsoft's app banner solution in January in the post "App Store Meta Tags."
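
For reference, the two meta tag approaches I covered in that post look roughly like this; the IDs are placeholders, and the real values come from each store.

  <!-- Apple Smart App Banner, Safari on iOS 6 -->
  <meta name="apple-itunes-app" content="app-id=123456789">

  <!-- Microsoft's equivalent for Windows Store apps in IE10 -->
  <meta name="msApplication-ID" content="App">
  <meta name="msApplication-PackageFamilyName" content="PublisherName.ExampleApp">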

You might also note that I stated that Google Play offers no such feature. Google, the force behind Android and the one now (or soon) penalizing sites in its search engine for app interstitials, provides no corresponding alternate solution of its own.

A great thing that Google could do for its Android users, for its search engine results, and for app developers, is to support a custom meta tag that allows web sites to promote their own Android apps in the Play store. Developers can start to replace awful Android app interstitials on web sites, users can get a cleaner experience, and site owners who can't conceive of other ways to promote their apps on their home pages can move toward something that is easier to maintain and doesn't penalize them.

I think it's nice that Google is paying attention to web devs by adjusting search results, but my ranty tweets are probably falling on deaf ears. The web would be indebted to someone who can get Google's and Android's ear on this.

Tuesday, November 6, 2012

Social Media Profile versus a Web Site

We paid $3,000 in Facebook ads last year to attract some new fans. Now, with this fancy [promote] button, we can pay $3,000 more for those fans to actually see our updates.
This image gleefully stolen from The Page That No One Will Ever See. Now it may be a seen page.

Yesterday an eye-catching headline popped up in my Twitter feed: 6 Reasons Facebook and Twitter Are More Important Than a Website (which is a different message than the author's "infographic" that suggests users find Facebook more useful than a brand's site). I have been down this road more than once, but I thought I would follow the link and see what those six well-thought, sound business reasons must be.

1. Websites Require Constant Maintenance

Given the immediate nature of social media, a traditional marketing web site needs far less maintenance than a social media presence that must constantly engage followers. I argue that many users expect content on a web site to be relatively constant, updated as appropriate and, in the case of some web applications, automated to a degree.

A Twitter account that pushes out content every few days, however, might be considered slow. One that pushes content every few minutes can be an assault to a follower's timeline. One that doesn't respond to tweets from users might be considered disrespectful.

Contrast this with a Facebook page that has some traction and has many fans. When those fans post to the brand's wall or comment on posts from the brand, they expect a quick response, which requires constant vigilance to keep users from feeling like they are being ignored.

The author also claims web sites can cost between $50 and $5,000 to build, but makes no effort to identify how much a social media resource costs to maintain profiles and fresh content across multiple social media outlets. This assumes a business isn't so clueless that interns are considered good resources for representing the entire brand on social media.

Sticking with the cost argument, I think the author hasn't been paying attention to recent Facebook changes in the form of promoted content.

2. Social Media Is Scalable

The author seemingly assumes most web sites are hosted on servers under desks. Granted, the real point is that a web site may not be able to handle traffic from a random viral traffic spike.

This may very well be true for some sites, but given how many sites are hosted on, and get resources from, content delivery networks and national hosts, the need to scale can often be handled with a phone call to a hosting provider to kick the site into the next hosting bracket. One cannot call Twitter or Facebook when it has been overloaded and demand it scale up for your traffic.

Interestingly, pages on my own site have suffered far more downtime as a result of embedded content from Twitter and Facebook. When they suffer the inevitable and random "fail whale," their poorly-written scripts can take down my entire page. At least when social media platforms are on the fritz, I can still direct users to the rest of my web site for information.

3. Websites Require Specialized Knowledge

I am actually a little sad this point isn't true. With the preponderance of WYSIWYG editors, export-to-web features, and free platforms like Blogger or WordPress with pre-built themes, it's far too easy for someone without specialized knowledge to get his or her message out there. And this is a good thing.

The author does make a point that to have a truly unique site with modern standards such as HTML5 and CSS will require someone with skill to do it for you. Oddly, the alternative he proposes is to use exactly the same Facebook or Twitter layout as everyone else. And I can personally guarantee it won't be built to modern standards such as HTML5 and CSS.

To be fair to social media, almost no web site claiming to be built to modern standards actually is either.

4. Your Customers Are Already on Social Media

Really? He knows that? He has run a series of surveys, done market analysis, engaged my users directly and determined they are on social media? And he found more than 15% of US adult web users are on Twitter?

For my target audience, he is right. Although that's by accident. I can also rattle off plenty of businesses (including my own clients) who don't know that for sure, haven't done the research, aren't in a position to, and can even guess that it's still not true.

The assumption in the article is that users are already inundated with web addresses. He argues that somehow a link to a Facebook page can percolate above all that, that even a Twitter hashtag will make sense to more users. The logic is that users are already on social media, so they'll just go right to your message.

Never mind that your target users may be in a demographic that doesn't use social media. Or your business may not be a fit for social media. Or that there are still more web users than Facebook users (even if you include the thousands and thousands of fake accounts). Or that there is already enough noise in my Twitter and Facebook feeds that I don't see stuff from my real-life friends.

5. You’ll Be Easier to Find

Using SEO as a dirty word (well, it is), the author suggests that it's hard to find things on the web. He says social media platforms have their own search already, so if you just focus there you will be found much more readily.

To make an anecdotal argument here, which is abnormal for me but curiously appropriate in this case, I can tell you that if I want to find a brand or person on Twitter or Facebook, I go to Google first. Google provides a far better search for me than I can get in Facebook's or Twitter's results, partly because both Twitter and Facebook are too busy trying to pitch me or assume I know their lingo. If I'm not logged into either one, it's an overall useless experience. If I am trying to research a product or service, then Facebook and Twitter are the last places I'll go.

Given how readily Twitter suggests people or brands I should follow that are either promoted, of no interest, or that I have already unfollowed, I would not rely on the discovery method of gaining new followers. Given how Facebook has changed its model to require you to pay to get your message in front of fans and their friends (promoted posts), I wouldn't rely on discovery there, either.

If you dismiss the value of a search engine to help users find you and rely solely on the search and discovery features of social media, then you are painting yourself into a corner. Twitter use won't generate enough content over time for all your targeted phrases (unless you constantly assault followers) and neither will Facebook, because they both push older content down, out of the view of search engines.

6. Facebook and Twitter Facilitate Content Creation

Yes, they do.

When I am particularly angry at a brand, I go right to their Facebook wall and post my issue. I also approach them on Twitter. In some cases, I hijack their hashtags. I create all sorts of content about how much that brand has disappointed me. The brand may respond and make it right, but my words are out there, getting indexed by Google, being associated with the brand.

But that's not what the author means; he means (his words) that content "can often be generated through the simple click of an upload button." Regardless of the fact that you need someone to take that photo, craft that caption, be available to respond if people engage with it, and even hope that anyone cares, he's telling us that content is free and writes itself.

Which it doesn't. Otherwise I wouldn't have had such a good turnout (and feedback) at my content strategy session at the local WordCamp.

Wrap-up

Only in the closing paragraph does the author suggest that maybe you might still need a web site and maybe you might benefit from Twitter and Facebook. So I have to ask myself, why didn't he lead with this? Why are the hollow arguments told strictly from the perspective of spending your effort on Facebook and Twitter to the detriment of your site? Because he's a social media services peddler.

If the author truly believed that Twitter and Facebook are more important to have than a web site, then I look forward to when he demonstrates that belief by shutting down his site and moving it all to Facebook and Twitter. Until then, it's a poorly-argued sales pitch.

Related

These are posts I've written that go into more detail on some of the points I raise above. Traditional web sites easily have as many issues and more, but that's not what this discussion is about.

Tuesday, October 30, 2012

Confusion in Recent Google Updates

Google pushed out some updates recently which have had SEO experts and spammers, as well as the average web developer or content author, a bit confused. It seems that some sites have been losing traffic and attributing the change to the wrong update. It also seems that some of this has percolated up to my clients in the form of fear-mongering and misinformation, so I'll try to provide a quick overview of what has happened.

Exact Match Domains (EMD)

For years identifying a keyword-stuffed domain name for your product or service was considered the coup de grace of SEO. Frankly, on Google, this was true. For instance if my company, Algonquin Studios, wanted to rank highly for the search phrases web design buffalo or buffalo web design then I might register the domains WebDesignBuffalo.com and BuffaloWebDesign.com. I could even register nearby cities, like RochesterWebDesign.com, TorontoWebDesign.com, ClevelandWebDesign.com, and so on, with the intent to drive traffic to my Buffalo-based business.

Google has finally taken steps to prevent that decidedly spammy, user-unfriendly practice. With the EMD update, Google will look at the domain name and compare it to the rest of the site. If the site is a spammy, keyword-stuffing, redirection mess, then it will probably be penalized. If the domain name matches my company name, product or service and (for this example) is located in the area specified by the domain, then it will probably not experience any change.

In all, Google expects this will affect 0.6% of English-US queries.

Panda/Penguin

While spammers panicked about this change, some non-spammy sites noticed a change at about the same time. This may have been due to Panda and Penguin updates that rolled out around the same time, and have been rolling out all along.

Considering the Panda update was affecting 2.4% of English search queries, that's already four times the impact of the EMD update. Considering that Google pushes out updates all the time, tracing any change in your Google result position to one single update is going to be tough.

A couple tweets from Matt Cutts, head of the web spam team at Google, help cut to the source instead of relying on SEO middle-men to misrepresent the feedback.

This one details the number of algorithm changes that regularly happen.

The trick is trying to recognize what on your site might have been read as spam and adjust it to be user-friendly, not to try to tweak your site to beat an ever-changing algorithm.

Disavowing Links

This one qualifies as confusing even for a web developer like me.

The only feature Google has added that I think takes potential fun away from blogs (or any site that allows commenting) is the tool to disavow links. This tool allows a site owner to essentially tell Google not to count links pointing at it when figuring PageRank.
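
The tool works from a plain text file that you upload; a sketch of the format (the addresses here are made up):

  # Lines starting with a hash are comments
  http://spam.example.com/comments/page7.html
  # Prefix a line with domain: to disavow an entire domain
  domain:link-farm.example.com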

One reason I don't like it is that it allows sites that have engaged in black-hat SEO tactics and have ultimately been penalized by Google to undo the now-negative effects of paid links, link exchanges and other link schemes that violate Google's terms. While this is good for sites that have been taken to the cleaners by SEO scammers, I still don't like how easily they could be excused.

Another reason I don't like it is that all those liars, cheaters, scammers, spammers, thieves and crooks who have spam-posted to my blog can go and disassociate those now-negative links to their sites. Sadly, associating their sites with filth of the lowest order by careful keyword linking (as I have done at the start of this paragraph) is the only ammo I have with which to take pot-shots at their spam juggernauts.

This new tool means you might not see spammers harassing you to remove their own spammy comments from your blogs. Which is unfortunate, because ignoring them seems only fair.

Just this morning Matt Cutts tweeted a link to a Q&A to answer some questions about the tool.

The post includes some answers intended to address concerns like mine.

Meta Keywords, Redux

As I have said again and again, the use of meta keywords is pointless in all search engines, but especially in Google. This doesn't stop SEO snake-oil salesmen from misrepresenting a recent Google update to their prospects.

Last month Google announced its news keywords meta tag, which does not follow the same syntax as traditional (and ignored) keyword meta tags. An example of the new syntax:

<meta name="news_keywords" content="World Cup, Brazil 2014, Spain vs Netherlands">

From the announcement, you can see this is clearly targeted at news outlets and publishers that are picked up by Google News (your blog about cats or your drunk driving lawyer web site won't benefit):

The goal is simple: empower news writers to express their stories freely while helping Google News to properly understand and classify that content so that it’s discoverable by our wide audience of users.

For further proof, the documentation for this feature is in the Google News publishers help section.

In short, unless your site is a valid news site, don't get talked into using this feature and fire the SEO team that tries to sell you on it.

Related

Monday, October 22, 2012

SEO Isn't Just Google

This past weekend I had the pleasure of participating in Buffalo's first WordCamp for WordPress users. Before my presentation I made it a point to sit in on the other sessions that were in the same track as mine.

When discussing SEO, all the sessions I saw mentioned only Google. The Google logo appeared throughout, Google's PageRank was discussed, Google search result screen captures were used, and so on.

The presenters for an SEO-specific session even went so far as to embed a video of Matt Cutts (from Google) in their presentation and declare that Matt Cutts stated that WordPress is the best platform for SEO.

For context, Matt Cutts appeared at a WordCamp in May 2009 to discuss his search engine (Google) for an audience using a particular platform (WordPress). Matt even said, "WordPress automatically solves a ton of SEO issues. Instead of doing it yourself, you selected WordPress" (at about 3:15 in the video). He's pitching his product to a particular audience to validate their technical decision (he's just saying they don't need to manually code these tweaks).

If while watching that video you heard Matt Cutts declare that WordPress is the best platform for SEO, then you are engaging in selection bias.

This same selection bias is also happening when developers work so hard to target Google and not any other search engines. If you convince yourself that Google is the only search engine because you don't see other search engines in your logs, then perhaps you are the reason you don't see those other search engines.

To provide context, this table shows the ratio of searches performed by different search engines in August 2012 in the United States. These are from comScore's August 2012 U.S. Search Engine Rankings report.

Google Sites 66.4%
Microsoft Sites 15.9%
Yahoo! Sites 12.8%
Ask Network 3.2%
AOL, Inc. 1.7%

It's easy to dismiss 16% when you don't know how many searches that translates to.

More than 17 billion searches were performed in August 2012. Google ranked at the top (as expected) with 11.3 billion, followed by Microsoft sites (Bing) at 2.7 billion. The breakdown of individual searches per engine follows:

Google Sites 11,317,000,000
Microsoft Sites 2,710,000,000
Yahoo! Sites 2,177,000,000
Ask Network 550,000,000
AOL, Inc. 292,000,000

To put this another way, for every four (ok, just over) searches using Google, there is another search done in Bing. For every five searches using Google, there is another one done using Yahoo.

If your logs don't reflect those ratios in search engines feeding your site, then you need to consider if you are focusing too hard on Google to the detriment of other search engines.

Now let's take this out of the United States.

Considering Bing's partnership with the Chinese search engine Baidu, contrasted with Google's battles with the Chinese government, it might be a matter of time before Bing tops Google for Asian searches. Given the size of the Asian market (over half a billion users), if you do any business there it might warrant paying attention to both Baidu and Bing.

Related

Update: May 15, 2013

Bing is now up to 17%, having taken almost all of that extra point from Google.

Friday, August 26, 2011

We Really Still Have to Debunk Bad SEO?

Image of bottle of SEO snake oil.

I've been doing this web thing from the start (sort of — I did not have a NeXT machine and a guy named Tim in my living room) and I've watched how people have clamored to have their web sites discovered on the web. As the web grew and search engines emerged, people started trying new ways to get listed in these new automated directories, and so began the scourge of the Search Engine Optimization (SEO) peddler.

The web magazine .Net posted what to me is a surprising article this week (surprising in that I thought we all knew this stuff): The top 10 SEO myths. I am going to recap them here, although you should go to the article itself for more detail and the full list of reader comments. Remember, these are myths, which means they are not true.

  1. Satisfaction, guaranteed;
  2. A high Google PageRank = high ranking;
  3. Endorsed by Google;
  4. Meta tag keywords matter;
  5. Cheat your way to the top;
  6. Keywords? Cram 'em in;
  7. Spending money on Google AdWords boosts your rankings;
  8. Land here;
  9. Set it and forget it;
  10. Rankings aren't the only fruit.

The problem here is that for those of us who know better, this is a list that could easily be ten years old (with a couple obvious exceptions, like the reference to AdWords). For those who don't know better or who haven't had the experience, this might be new stuff. For our clients, this is almost always new stuff and SEO snake oil salesmen capitalize on that lack of knowledge to sell false promises and packs of lies. One of my colleagues recently had to pull one of our clients back from the brink and his ongoing frustration is evident in his own retelling:

I have a client who recently ended an SEO engagement with another firm because they wouldn’t explain how they executed their strategies. Their response to his inquiry was to ask for $6,000 / month, up from $2,000 / month for the same work in two new keywords.

This kind of thing happens all the time. I recently ran into another SEO "guru" selling his wares by promising to keep a site's meta tags up-to-date through a monthly payment plan. When I explained that Google doesn't use meta tags in ranking, his response was that I was wrong. When I pointed him to a two-year-old official Google video where a Google representative explains that meta tags are not used, his response was to state that he believed Google still uses them because he sees results from his work. My client was smart enough to end that engagement, but not all are.

Because I cannot protect my clients in person all the time, I have tried to write materials to educate them. For our content management system, QuantumCMS, I have posted tips for our clients, sometimes as a reaction to an SEO salesman sniffing around and sometimes to try to head that off. A couple examples:

Along with these client-facing tips I sometimes get frustrated enough to write posts like this, trying to remind people that SEO is not some magical rocket surgery and that those who claim it is should be ignored. I've picked a couple you may read if you are so inclined:

And because I still have to cite this meta tags video far far too often, I figured I'd just re-embed it here:

Related

My ire doesn't stop at SEO self-proclaimed-gurus. I also think social media self-proclaimed-gurus are just the latest incarnation of that evil. Some examples:

Tuesday, August 16, 2011

Thoughts on Muse (Obvious Pun Avoided)

Muse logo.

I downloaded and installed Adobe's new web design tool, Muse (code name) (also at Adobe Labs) out of morbid curiosity. Just like Adobe Edge (which refuses to launch), I had very little expectation that this would be a fully-developed sales-ready product. Instead of getting into extensive detail about the quality of its code, its accessibility support, and so on, I figured I'd do a very quick review of how I think it affects web developers.

The target audience is pretty clear from the Muse (code name) web site introduction:

Create websites as easily as you create layouts for print. You can design and publish original HTML pages to the latest web standards without writing code. Now in beta, Muse [(code name)] makes it a snap to produce unique, professional websites.

And this:

Design your pages
Focus on design rather than technology. Combine images and text with complete control, as flexibly and powerfully as you do in Adobe® InDesign®.

Right there is the gist of the product — enable print designers to convert their designs into web pages. Just like Photoshop would produce massive image slices to support Photoshop "designers," this product isn't about the code. With its integration of jQuery effects and Lightbox widgets, it seems almost like this would be a tool for a photographer to build a gallery site.

If you are a coder, or someone who cares about the code, this tool isn't for you. You will quickly see that the HTML it produces is not exactly structural or semantic, and that the piles of CSS and JavaScript aren't exactly necessary. Muse (code name) doesn't allow you to edit the HTML, so you still need to "publish" your work before you can edit it. If part of your coding process involves making your HTML meet accessibility standards or even just structuring your content for good SEO, you will find it impossible.

If you are part of a web team, perhaps in an ad agency or interactive firm, then you will find that this tool doesn't allow you to collaborate well. If you get a favicon, for example, from another member of your team, Muse (code name) cannot import it; it only accepts PNG, GIF or JPG. If you receive a background image to drop into an element, Muse (code name) will crop the image, even increasing its dimensions to fill the element, regardless of your plan to allow more of the image to be revealed should the container change in size.

If you find yourself pasting HTML code from Google Maps or Twitter in order to embed third-party widgets on your site, you may find that is nigh impossible short of publishing your pages and then hacking through the HTML output. While I did not find a menu option to do that, even if it exists it will require a full "publish" step every time you want to tweak your embed code.
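
For anyone who hasn't pasted one of these, third-party embed code is typically a chunk of markup plus a script, along these lines (a generic sketch, not any vendor's exact snippet):

  <a href="https://twitter.com/example" class="twitter-follow-button">Follow @example</a>
  <script src="//platform.twitter.com/widgets.js"></script>

Without a spot to drop that markup into the page, the copy-and-paste workflow falls apart.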

If you find yourself leaning on CSS techniques as simple as printable styles or as complex as media queries to support alternate display sizes, you will be disappointed. This tool is not intended to support liquid designs, adaptive layouts, document re-flow, or really anything related to alternate viewing.

If you support a web content management system, then for all the reasons above Muse (code name) is not a good fit. Just building a single page to use as a template will require a great deal of work to reformat the code to fit into most content management systems that are out there. Should you ever have to change the core template you either have to go back to Muse (code name) and repeat the process, or you will have to skip Muse (code name) for all future revisions.

In short, it comes down to these two key points:

  1. Muse (code name) has the potential to be a great tool for the single graphic designer interested in showing off his or her work without having to learn a technology outside of his/her knowledge area (nor worry about accessibility, standards, alternate displays, SEO, etc.);
  2. If you are a web developer (or web development firm), your job is not at risk. Muse (code name) is making no effort to replace you. If anything, it might spare you calls from people who might not be good clients anyway.

If you are looking for a pedantic review of the HTML output, I suspect plenty of others will cover that. Since Muse's (code name) target audience won't care, and anyone who does care will already know the issues just by playing around, it's not even worth getting into here. Besides, with 120,000 other people downloading Muse (code name) after the first day, I expect plenty of reviews of the markup will come.

Now to Examples!

These aren't intended to be open shots at Muse (code name), but instead I hope someone at Adobe can use them to help better it overall.

Photo of the Muse (code name) UI on my netbook.

This image shows how Muse (code name) looks on my netbook (I may have tweeted my frustration last night). As you can see, the menus are off the top of the screen along with every other useful feature. I was able to close the application thanks to standard keyboard shortcuts.

Screen shot of my sample page.

Using the default page size settings (960 pixels with a min-height of 500 pixels), this is the sample site I quickly cobbled together. I did not start with a design or goal other than throwing some elements on the page, so don't tell me my site isn't as awesome looking as it could be. Because it couldn't be awesomer.

What about the file output you ask? Here is the /css directory:

File name Size (bytes)
articles.css 5,106
bio.css 5,106
blog.css 5,106
books.css 5,106
contact.css 5,106
ie_articles.css 5,009
ie_bio.css 5,009
ie_blog.css 5,009
ie_books.css 5,009
ie_contact.css 5,009
ie_index.css 5,009
index.css 5,106
site_global.css 4,305

The duplicates are for IE support, and you can expect to see all your content in every page twice, as it relies on IE conditional comments to serve up one copy for IE9 and one copy for anything below IE9.
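
For anyone unfamiliar with the pattern, conditional comments gate content like this (a sketch of the general technique using the file names above, not Muse's exact output):

  <!--[if lt IE 9]>
    <link rel="stylesheet" href="ie_index.css">
  <![endif]-->
  <!--[if gte IE 9]><!-->
    <link rel="stylesheet" href="index.css">
  <!--<![endif]-->

The first block is only parsed by IE8 and below; the second is revealed to IE9 and every non-IE browser.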

Here is the /scripts/0.9 directory:

File name Size (bytes)
jquery-1.4.4.min.js 78,766
jquery.museMenu.js 2,382
MuseUtils.js 9,317
SpryDOMUtils.js 14,604

Without those script files, those simple-looking menus on my example just don't render.

That background image I mentioned earlier? Muse (code name) re-cropped it and converted it to a PNG file, increasing both the dimensions and file size:

File name Size (bytes) Dimensions
Banner_bg.jpg 11,271 627 x 80 original image
master_U355_full.png 41,800 960 x 97 Muse (code name)-ified image

Related

Monday, June 27, 2011

Social Scoring As the New SEO

Change in my Klout score over 30 days

Lately I have noticed that Klout is getting a lot of traction in discussions about social media. It may be that there is just more coverage, or the name has started to penetrate to more users, or the idea of social scoring is becoming more interesting to marketers. It's also possible that I am keyed into it because I don't really believe the scoring model is truly accurate. On this last point, you can see my blog post from Tuesday, Does Your Klout Score Mean Anything? to understand my skepticism.

My skepticism was not assuaged when I noticed that my Klout dashboard (yes, I finally signed up) claims that my score has gone up 2 points in the last 30 days, justified by the handy plot Klout provides of that same period, even though I know 30 days ago my score was 7 points lower (I took a screen capture). It appears that after I added my Facebook and LinkedIn profiles to Klout, it retroactively adjusted my score. Which sounds very much like revisionist history and further erodes my confidence in Klout's platform.

Despite this clearly math-free scoring model, Mashable opened an article (with the technically accurate, but still unfortunate page address of klout-gate) last week with this unsurprising statement:

Your Facebook influence, as measured by Klout, will determine your access level to select brand pages on Facebook — and it could net you perks.

This weekend The New York Times published Got Twitter? You've Been Scored where it examines the scoring trend and what it means for consumers.

Companies with names like Klout, PeerIndex and Twitter Grader are in the process of scoring millions, eventually billions, of people on their level of influence [...] [T]hey are beginning to measure influence in more nuanced ways, and posting their judgments — in the form of a score — online.

The New York Times article takes it a bit further and outlines specific examples in the real world where a Klout score can have an impact on the perks, benefits and discounts a consumer might receive from brands:

More than 2,500 companies are using Klout's data. Last week, Klout revealed that Audi would begin offering promotions to Facebook users based on their Klout score. Last year, Virgin America used the company to offer highly rated influencers in Toronto free round-trip flights to San Francisco or Los Angeles. In Las Vegas, the Palms Hotel and Casino is using Klout data to give highly rated guests an upgrade or tickets to Cirque du Soleil.

While I believe this has significant parallels to SEO in its early days, the key difference is that this impacts individual consumers, not organizations trying to promote their web sites. The direct correlation with a tangible cost (whether by discount or freebie) also appeals to an individual more readily, since you don't need an accountant and a balance sheet to figure out the benefits.

Since a Klout score is partly driven by your Twitter follower count, Twitter replies, Twitter retweets, Facebook friends and comments, and now LinkedIn activity (with the promise of Foursquare in the future), we can expect to see the recent scourge of Twitter follower guarantees (among others) make a comeback. Within a company, you might have an individual tasked with wading through these detrimental sales pitches, but the average consumer who wants a discount at a hotel might not understand that a promise of thousands of Twitter followers may ultimately guarantee a significant reduction in one's "score."

If you read this blog, then you are either technically capable, fancy yourself a social media expert, or think you know something about SEO (or you're my mom — Hi Mom!). Now would be a good time to take those skills and try to help your friends and family. Look for the same trends we saw in SEO spam ("Submit your site to hundreds of search engines!") and social media spam ("Twitter followers guaranteed!"), only with promises tweaked to the new target market (consumers).

Just as some companies wear their Google PageRank with pride (or shame), we may start to see individual people do the same with the per-person analogue — their social influence score.

Thursday, March 3, 2011

Recent(ish) News on Google, Bing, SEO/SEM

Google Logo

I have written many times here about SEO/SEM and how so much of it is sold to organizations by scam artists (though I recoil at the thought of calling them "artists"). Too often it includes demonstrably false claims, like how meta keywords and descriptions will help your site and that you should invest in the SEO vendor to do just that.

I also try hard not to spend too much time addressing the ever-changing landscape of the search engines, let alone focusing on just one of them. However, sometimes it's worth wrapping up some of the more interesting developments because they can genuinely affect my clients who aren't trying to game the search engines.

Content Farms and Site Scrapers

If you've spent any time searching through Google you may notice that sometimes you get multiple results on your search phrase that look the same in the results, but when visiting the site you find they are just ad-laden monstrosities with no value. Sometimes one of these spam sites would appear higher in the Google search results than the site from which the content was stolen.

Google has now taken steps to not only push those sites back down to the bowels where they belong, but also to penalize those sites. These changes started in late January and went through some more revisions at the end of last month.

I think it's fair to expect Google to keep tweaking these rules. Given all the sites that offer RSS feeds of their content (along with other syndication methods), it's likely that many sites integrate content from external sites into their own. The trick here will be for Google to distinguish a site that has original content and also syndicates third-party content from a site that has nothing but content taken from elsewhere. If you do syndicate content, then you should be sure to watch your site stats and your ranking in the search results to see if you are affected at all.

Additional reading:

Page Titles

Perhaps you have spent a great deal of time carefully crafting your page titles (specifically the text in the title element, which displays in your browser title bar). Perhaps you have noticed that in Google the title you entered is not what appears on the search results page. This isn't a bug, doesn't mean your site was indexed improperly, and doesn't necessarily mean your page title had some other effect on your page rank. This is done intentionally by Google.

This does imply, however, that your titles are unwieldy. Google does this when titles are too short, when they are used repeatedly throughout a site, or when they are stuffed with keywords. If you find that your title is being cut off (implying it's too long) then you may want to limit your title to 66 characters, or at least put the most important information in those first 66 characters.
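
As a purely hypothetical example, a title that front-loads the important words and fits well inside that limit:

  <title>Widget Repair in Buffalo, NY | Example Company</title>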

Additional reading:

Social Media

It wasn't that long ago that Google and Bing said that links in social media (think Facebook and Twitter) would affect a site's position in search results (PageRank for Google). Some people may even be tempted to run out and post links to every social media outlet they can find, hoping that the more inbound links, the better for their site. Thankfully it's not that simple.

Both Google and Bing look at the social standing of a user when calculating the value of an inbound link. This can include number of followers (fans/friends on Facebook), number followed, what other content is posted, how much a user gets retweeted or mentioned and a few other factors. In short, those Twitter accounts that come and go in a matter of hours that tweet a thousand links into the ether aren't doing any good. A good social media strategy that is garnering success, however, should also give a boost to the sites it links.

What is not clear, however, is how URL shorteners (and which ones) affect the weight of those links.

Additional reading:

Random Bits

These are some random articles I collected for posts that never happened. I still think there's good stuff in these, and they warrant a few minutes to read.

Google: Bing Is Cheating, Copying Our Search Results and Bing: Why Google's Wrong In Its Accusations should be read together. The accusation from Google that Bing is stealing its search results is fascinating on its own, but reading Bing's response demonstrates a host of things Bing also does differently. For me it was an entertaining battle, but that's about it.

HuffPo's Achilles Heel discusses how Huffington Post relies on questionable SEO techniques, which I equate to spamming, and wonders how long the site will be viable if AOL isn't willing to keep up the SEO game as the rules change. It could be a great purchase for AOL, or a dead site full of brief article stubs.

Is SEO Dead? 1997 Prediction, Meet 2009 Reality is a two-year-old article dealing with a twelve-year-old argument. And still relevant.

When A Stranger Calls: The Effect Of Agency Pitches On In-House SEO Programs should be particularly interesting to people who are charged with some form of SEO within an organization. Too often the unsolicited call or email comes in making grandiose promises and citing questionable data and results. This article provides a good position from which to push back and make sure you and your employer aren't taken to the cleaners.

A 3-Step SEO Copywriting Confession almost sounds like an admission of wrongdoing, but instead talks about how to structure your content for SEO without completely destroying it.

Additional reading (that I wrote):

Thursday, December 9, 2010

Negative Reviews Can Now Affect Site Rank Downward

Panel from New York Times cartoon from the article.

One of the ongoing truths about search engine optimization (SEO) has been that inbound links are usually a good thing. This has caused SEO scammers to abuse the practice by creating and using "link farms," sites that exist solely to link back to client sites. This "spamdexing" technique is based on having many of these sites (hundreds, thousands) with nothing but links. Quite some time ago Google, Yahoo, and other search engines from the mists of history all recognized this bad practice and started penalizing sites listed on those indices.

When you get SEO spam email claiming that the spammer will list your site in 300 (or some other large number) search engines, this is often what they mean. If you can name more than three search engines, you are already ahead of most Internet users, and you recognize that 50, 100, 300, etc. are all untenable numbers. If you get an email from someone saying he or she likes your site, has linked to it, and wants you to link back, it's probably just another link farm.

Sadly, with the proliferation of community and social media sites that allow users to post comments and rate organizations, we see a repeat of the comment-spamming model that has caused blogs (among others) to implement CAPTCHA and other Draconian measures to try and hold back the tide of comment spam. As the adage "all press is good press" has led us to believe, coverage of any sort is good for business. That also means that sites that track comments about business, such as Epinions, Get Satisfaction and others like them, can end up boosting an organization's rank in the search engines even when customers complain about the organization. Let me restate — if people had anything to say about you, including bad things, they were giving you a bump in search engine results.

Cue an article in the New York Times, A Bully Finds a Pulpit on the Web, which tells the story of a woman who purchased a pair of glasses from a company that appeared at the top of the Google search results. Not only did she not get the product she wanted, it took a decidedly Single White Female turn and became a real life game of harassment from the vendor. The motivation for the vendor to behave this way is pretty clear from a comment on Get Satisfaction, posted by the very person harassing the customer.

Hello, My name is Stanley with DecorMyEyes.com. I just wanted to let you guys know that the more replies you people post, the more business and the more hits and sales I get. My goal is NEGATIVE advertisement.

When you see comments like these from "Michael" on Get Satisfaction, you can see how Michael's constant "complaints" are really doing nothing more than acting as link machines back to the offender's site. You decide if Michael is a real person or just part of the link spamming.

While this is an extreme case, it was enough for Google to take notice. On December 1 Google announced that it has made changes to its search algorithm (Being bad to your customers is bad for business):

[W]e developed an algorithmic solution which detects the merchant from the Times article along with hundreds of other merchants that, in our opinion, provide an extremely poor user experience. The algorithm we incorporated into our search rankings represents an initial solution to this issue[...]

Google makes some fair points about why it won't outright block (or lower the rank of) any organization's site that has negative commentary associated with the organization. In that scenario, many politician sites would fare poorly. Competing organizations could engage in a war of defamation on third party sites. And so on.

What's key about Google's statement is the phrase "extremely poor user experience." This goes beyond just poor customer service and defective products, and can now capture sites where people complain about the design or the usability. I am one of those people who has reached out to a site to complain about a technical or implementation problem (yes, I am that jerk) and, when faced with no response, have taken to the critique sites to restate my case (complaint, whining, whatever). If you get enough user experience (UX) designers to complain about a site's ability to confound, or enough disabled users to complain about a site's inaccessibility, those can now impact a site's overall Google rank.

As you read the Times article, remember that even if your organization would never behave that way, if your site is impossible to use and people say so on opinion sites, then you could fall into the same bucket.

While you're considering that, make sure your site loads quickly, too (see the last link below).

Related

Monday, November 15, 2010

Google Instant Preview Overview

Google Instant Preview showing how my site appears when using my name as the search term.

It's like I'm running out of novel titles or something.

With the launch of Google Instant a couple months ago, the landscape for SEO changed as site authors now had to consider targeting keywords for search results that appear as the user types (see my post at evolt.org: Google Instant and SEO/SEM). Anyone who thought that Google would sit still for a while was wrong. Just last week Google introduced Instant Previews, which now shows search users the preview of a page before visiting it (see the announcement at Beyond Instant results: Instant Previews). The embedded video from Google below provides an overview of the feature.

To be fair to others, Google didn't exactly pioneer this, but as the biggest player in the market, Google can have an outsized effect with its implementation. As a result, expect to see people react to it in different ways. Many users, however, may never know the images are available since they will have to click the magnifying glass icon (or use the right arrow key) to activate the feature.

One key feature of Google's implementation is that it highlights the keywords in the screen shot, showing users just what the search term hit within the page. This is particularly handy to see if your search term is hitting some main content, or just some ancillary content on the page, potentially saving users extra clicks and time spent reading through a site. In looking at my own site, I can see that it highlighted the introductory text instead of the giant h1 with my name in the banner.

Since showing the preview to the end user doesn't get reported in a site's Google Analytics data, it will be very hard to measure how effective the preview is at getting users to visit your site.

Before you get excited that, as a user, you can now spot SEO spam sites before clicking a link, don't get your hopes up. Google offers support for a meta tag that allows you to exclude a site from the preview (read the Instant Previews entry on the Google Webmaster Central blog for the scoop on other bits for web developers):

Currently, adding the nosnippet meta tag to your pages will cause them to not show a text snippet in our results. Since Instant Previews serves a similar purpose to snippets, pages with the nosnippet tag will also not show previews. However, we encourage you to think carefully about opting out of Instant Previews. Just like regular snippets, previews tend to be helpful to users—in our studies, results which were previewed were more than four times as likely to be clicked on. URLs that have been disallowed in the robots.txt file will also not show Instant Previews.

SEO spammers will likely use this, including domain name squatters and people who squat on misspellings of common web addresses. In time, users may even come to recognize that a lack of a preview is akin to hiding something.
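
For reference, the opt-out described in that quote is just a value on the robots meta tag:

  <meta name="robots" content="nosnippet">

The robots.txt route is the usual Disallow rule; anything Google isn't allowed to crawl won't get a preview either.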

If your site relies heavily on Flash or third-party plug-ins, you may also need to spend some time testing how it looks in Google Instant Preview. There is a reasonable chance that the preview will have broken puzzle piece icons in place of the Flash element. In the words of Google:

Currently, some videos or Flash content in previews appear as a "puzzle piece" icon or a black square. We're working on rendering these rich content types accurately.

This screen capture of the preview from Lifehacker with an embedded video, which almost ironically shows how to use Google Instant Preview with your keyboard, demonstrates what happens when you have plug-ins embedded in the page.

Google Instant Preview showing the Lifehacker page with a broken puzzle piece icon in place of an embedded movie.

You may also want to skip splash pages or interstitials (the modern splash page for within a site). A splash page may become the preview, and that could end up keeping users from following the link to your site. I've been railing against splash pages for well over ten years (see my article showing developers how to allow users to skip them from over 11 years ago: Let the User Skip the Splash Page), but perhaps this new feature of Google can help push more developers to abandon them.

Time will tell how well this feature is received by end users, and more importantly, how developers consider making changes to their sites as a result. I predict it's just a matter of time (probably backward in time, since I suspect this has already happened) before someone implements a user agent sniffer to detect the Google crawler and serve up different styles to optimize the layout of the page for the preview size and format. If I wasn't writing this post I'd probably be working on that instead.

Thursday, September 16, 2010

Google Instant and SEO/SEM (at evolt.org)

Image of Google Instant

Written this past weekend and live today, my latest article at evolt.org is available: Google Instant and SEO/SEM

This seemingly simple user interface change has quite a lot of potential to affect both user behavior and money spent on SEM/SEO. The next few weeks may prove to be very interesting.

If you are new to Google Instant (what's that, you're in one of those corporate environments that standardized on IE6 ten years ago and you cannot upgrade?), you can get an overview at the official announcement on the Google Blog, or you can watch the (self-congratulatory) video Google provided.

Wednesday, June 9, 2010

Google Caffeine Is Live

Illustration showing difference between old and new Google index.

Late yesterday Google announced the launch of its new web indexing system, Caffeine. According to Google, it provides 50% "fresher" results and is the largest collection of web content it has ever offered.

The big push for this new indexing technology is the rise (explosion, really) of real-time data on the web. With Twitter acting as a de facto news source for so many, and with other services such as location-based social media, rapid-fire blogs, mailing list archives and the like pushing data live at an ever-increasing rate, Google's old model of updating once a week (or every two weeks) just wasn't cutting it. We've seen Twitter integrated into Google search results already, but that has been strapped on to results that were otherwise out of date.

Google provided the image you see at the top of this post as an example of how the old index worked compared to how Caffeine works. Unless you speak blocks and the Bohr model, it really doesn't say much (except that I worry about the little guy in the cloud taking a camera to the head). The gist is, the old index was built in layers, with some content being refreshed more regularly than other content. Sites like CNN might live at the top of the index, being refreshed regularly. Sites like my personal pages might live at the very bottom, only being refreshed when someone goes in with a shovel to scrape the cruft off the edges of the database.

With Caffeine, they've flipped all that over. The crawler is always out, always indexing, and it's always updating the index. Whether it makes it to your site today or tomorrow is unknown, but now at least it's more likely to be picked up sooner rather than far far later. This is how Google describes it:

Caffeine lets us index web pages on an enormous scale. In fact, every second Caffeine processes hundreds of thousands of pages in parallel. If this were a pile of paper it would grow three miles taller every second. Caffeine takes up nearly 100 million gigabytes of storage in one database and adds new information at a rate of hundreds of thousands of gigabytes per day. You would need 625,000 of the largest iPods to store that much information; if these were stacked end-to-end they would go for more than 40 miles.

As a searcher, you may find that you are getting more timely results more regularly. The effect may not be immediate, however. I have already been trying it out, but my tweets aren't really a good example.

Monday, April 12, 2010

Your Site Speed to Affect Its Google Rank

Google Logo

If you've been paying attention to the world of SEO and the intersection with Google, then you may have heard a few months back that Google was considering using the speed of a site to affect a site's rankings. Google has already factored in the speed of a site when considering its AdWords quality score.

On Friday, Google announced that it is now implementing site speed as a factor in organic search rankings. What this means is that if your site is an extremely heavy download or just takes too long to draw, then it may be penalized in the organic search listings.

While Google doesn't explicitly define site speed, it's safe to assume it is a combination of overall page size (including files) and render time (including server response and time to draw the page). For those developers who seem incapable of posting anything smaller than a 1MB image in the banner, or of slimming down their HTML by removing all the extraneous cruft, this is motivation to start working on those optimization skills, even if their sites don't feel the wrath of the penalty.

Some things to keep in mind:

  • Currently only about 1% of search queries are affected by site speed.
  • There are over 200 factors used in determining page rank, and this one isn't weighted so highly that it overrides the major ones.
  • It currently only applies to visitors searching in English (although you can expect that to change over time).
  • It launched a few weeks back, so if your site hasn't changed in its search engine rankings, you are probably safe.
  • Google links to a number of tools to test the speed of your site. Check out the links at code.google.com/speed/tools.html, or try the quick timing script after this list.
  • Nearly four months old now, Google Site Performance is an experimental Google Webmaster Tools Labs feature that shows you latency information about your site.
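Google hasn't said exactly how it measures speed, but a crude first check is simply timing the raw HTML fetch. The sketch below (Python, with example.com standing in for your own page) measures only server response and download, not full render time, so treat it as a rough proxy rather than anything resembling Google's metric:

    # Rough page-speed check: times the HTML fetch only, not rendering
    import time
    import urllib.request

    url = "http://example.com/"   # swap in your own page

    start = time.time()
    with urllib.request.urlopen(url) as response:
        body = response.read()
    elapsed = time.time() - start

    print(f"{url}: {len(body) / 1024:.1f} KB in {elapsed:.2f} seconds")

If that number is multiple seconds before a single image or stylesheet has even been requested, you have found your starting point.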

Hopefully few of you are concerned by this. If you are following best practices, you are already striving to have your public-facing sites draw quickly. Not only does this do things like reduce the load on your servers, it also cuts down on your overall bandwidth costs. An additional advantage is that you don't have to rely on your end user having a fast computer, lots of RAM (or swap space on the drive), and a fast connection. Given how many people surf in corporate environments that aren't exactly cutting edge, this is just good practice.

Thursday, October 29, 2009

Reminder: See Me Speak, Tues. Nov. 3

I will be one of the panelists at the Infotech Niagara panel session titled "Maximizing Your Web Presence." It takes place Tuesday, November 3, 2009 at 8:00am at Buffalo Niagara Partnership's conference room, 665 Main Street, Suite 200, Buffalo, New York 14203. BNP has parking information at their web site.

From Infotech Niagara:

Ok, you have a website, now what?

Join infoTech Niagara for a panel discussion on "Maximizing Your Web Presence." Our panelists bring years of experience in web strategy, web design, search engine optimization, social media, web video and more.

Come learn from the experts what you can do to leverage new and existing technologies to maximize the effectiveness of your web presence.

Panelists include:

  • Adrian Roselli, Senior Usability Engineer, Algonquin Studios
  • Brett Burnsworth, President, Zoodle Marketing
  • Jason Holler, President, Holler Media Productions
  • Mike Brennan, Vice President, Noobis, Inc.

Continental Breakfast will be provided.

Cost:
ITN Members: $10
Non-Members: $15
Students: $5

Register online.


Wednesday, October 14, 2009

Derek Powazek on SEO as Snake Oil

There are many on the web who will recognize the name Derek Powazek. He is the name behind old-school sites such as Fray.com and Kvetch.com (which has apparently been taken over by spam bots) and wrote a book about communities (Design for Community, which mentions me by name, which is awesome). I also had the pleasure to meet him at SXSW back in 2001 and even participate in his Fray Cafe. So when I saw his blog post on SEO that started off with this statement, I was pleased:

Search Engine Optimization is not a legitimate form of marketing. It should not be undertaken by people with brains or souls. If someone charges you for SEO, you have been conned.

What pleases me more is that it echoes a comment I made in my post Verified: Google Ignores Meta Keywords back in September:

Those of us trying to protect our clients from SEO/SEM snake-oil salesmen are happy to finally have an official statement from Google.

Now that I've tooted my horn and compared myself to someone Folio Magazine considered one of the top 40 "industry influencers" of 2007, let me get to my point. I've been working on the web since HotJava was still a browser. I was excited when the first beta of Netscape Navigator made its way to the world, when Yahoo was a couple of guys in a dorm posting links, and when my Jolt Cola web site was included in their index because I asked them to include it. Since then, the way people find things on the web has changed dramatically. For the last decade or so the search engine has become more than a convenience; it's a necessary feature of the web, without which we'd be stuck wandering billions of terrible pages of things we don't want to see (many thousand fewer of those pages once GeoCities finally closes down). Because of this, getting your site to the top spot in a search engine has become the holy grail of online marketing, one that far too many people are happy to exploit as an opportunity.

Derek makes two key points in his article:

  1. The good advice is obvious, the rest doesn’t work.
  2. SEO is poisoning the web.

He supports the first point by noting that formatting, structure, summaries, quality links and so on have worked since the beginning and will continue to work. There's no magic there; it's free to read anywhere on the web. For his second point he references all the Google-bombing tactics employed by bots to spam blogs, comment areas, Twitter accounts, parked domains, etc., as well as questionable tactics that exploit loopholes (albeit temporary ones) in a search engine's ranking algorithm.

As of now the article has 180 comments, many of them from optimizers who take umbrage at the blanket statement that SEO is the work of the soulless foulspawn and their dark arts (my words, but I think I summarize his sentiment well enough). After receiving so many comments Derek added a post yesterday, his SEO FAQ, responding to generalized versions of the questions and comments. He also offers some suggestions, including this one targeted at clients (I just took the first part):

If someone approaches you about optimizing your search engine placement, they're running a scam. Ignore them.

Having said something similar to clients in the past, this is normally where I segue into a discussion of how I've worked hard to ensure Algonquin Studios' content management system, QuantumCMS, adheres to best practices and provides many ways to get quality content into pages, links, titles, page addresses, meta information (after I tell them Google doesn't use meta data for ranking, but they insist because they've been conditioned to think that way), and so on. This is also the part where I remind clients that their help is needed to write that copy; to interact with users, customers, partners, and industry organizations to generate quality relationships and references (often in the form of links); and to plan to spend time working on this regularly to keep it fresh and relevant.

I look forward to the time when I won't be spending chunks of my day clearing spambots from my QuantumCMS Community forum, batting down spam email about submissions to 300+ search engines, ignoring bit.lys in unsolicited Twitter @ responses, and generally fighting the after effects of all the black hat SEO enabling we've allowed for years.

Tuesday, October 13, 2009

Come See Me: November 3

I will be one of the panelists at the Infotech Niagara panel session titled "Maximizing Your Web Presence." It takes place Tuesday, November 3, 2009 at 8:00am at Buffalo Niagara Partnership's conference room, 665 Main Street, Suite 200, Buffalo, New York 14203. BNP has parking information at their web site.

From Infotech Niagara:

Ok, you have a website, now what?

Join infoTech Niagara for a panel discussion on "Maximizing Your Web Presence." Our panelists bring years of experience in web strategy, web design, search engine optimization, social media, web video and more.

Come learn from the experts what you can do to leverage new and existing technologies to maximize the effectiveness of your web presence.

Panelists include:

  • Adrian Roselli, Senior Usability Engineer, Algonquin Studios
  • Brett Burnsworth, President, Zoodle Marketing
  • Jason Holler, President, Holler Media Productions
  • Mike Brennan, Vice President, Noobis, Inc.

Continental Breakfast will be provided.

Cost:
ITN Members: $10
Non-Members: $15
Students: $5

Register online.
