On Google and MetaFilter
Posted by: Matt Saler
Last month, Matt Haughey of MetaFilter published a piece on Medium about the status of one of the true stalwarts of the internet:
MetaFilter is the little weblog that could, established in 1999 as one of the first community blogs. Over its fifteen year history it has expanded from a place to discuss interesting things on the web to include Ask MetaFilter as a community question and answer (Q&A) site, along with more subsections for things like music by members, completed projects by members, meetups among members, and most recently TV and movies.
While MetaFilter is relatively small (only about 62,000 have paid the one-time $5 for an account to date and 12,000-15,000 of those members come back to interact with the site every day), we have a great group of members, and I think we consistently have some of the best discussion on the web, with the sites attracting over 80 million readers last year. Our commenters are literate and thoughtful, and our site is watched around the clock by a staff of moderators. Despite the site’s modest stature its influence makes waves in the larger world (like mentions on popular TV shows: Tremé and Mythbusters).
Unfortunately in the last couple years we have seen our Google ranking fall precipitously for unexplained reasons, and the corresponding drop in ad revenue means that the future of the site has come into question.
Haughey goes on to explain the situation at length.
His story is alarming for a number of reasons. MetaFilter is highly regarded, with a reputation as a good citizen of the internet and a community that is generally one of the best around. The realities Haughey and his staff now face are brutal and should have been avoidable.
However, the nature of Google in 2014 is such that determining where MetaFilter went wrong (or even if they went wrong) is difficult.
What we do know is that, as the company has grown over the years, Google has become increasingly opaque and monolithic. It has also become seemingly hostile to some of the principles it was originally known for. The famous but informal Google motto “Don’t be evil” has become a punchline to some observers, particularly after the sudden shuttering of Google Reader and the intrusive full-court press on Google+. This MetaFilter story only adds to the narrative of a Google that is losing sight of the open internet, where the cream rises to the top.
On the surface, the MetaFilter situation is related to positive changes Google has made to its algorithm, aimed at reducing instances of low-quality content and serving up high-quality content. However, if you do a Google search today, you still get content-farm results and low-quality answers from sites like Ask Yahoo! If this were a case of MetaFilter losing out to quality competition, or of MetaFilter using unsavory SEO techniques that Google is flagging, it wouldn’t be so disturbing—it would be expected. As it is, you have MetaFilter losing out for reasons that remain opaque to the outside.
Here you have an extreme case that offers a harsh lesson: you can do all of the right things and still lose out if you are too dependent on a third party company for your success. It’s true with Facebook and other social media sites, and it’s true with Google too.
MetaFilter, for example, is now pivoting to another, less ad-dependent business model, but it seems they should have done so earlier as a safeguard against the shifting sands of Google’s algorithm.
Our guidelines for customers remain what they were before: keep your SEO efforts clean and produce good content while striving to engage your users in a meaningful and lasting way. Just remember that what happened to MetaFilter underlines the importance of the engagement component. Search results can change, but a proper web strategy can minimize the damage to your business when they do.
For some more background on the story, check out this episode of the TLDR podcast.
Preventing Spam Without Captcha
Posted by: Elexicon
Spam emails are extremely annoying. Unfortunately, spambots are getting smarter and smarter every day. People have developed some pretty clever methods to prevent spam, but the most popular are also an inconvenience to your users. I’m speaking, of course, about Captcha.
According to the Captcha website, Captcha is
“a program that protects websites against bots by generating and grading tests that humans can pass but current computer programs cannot.”
Typically, the test will consist of distorted text embedded in an image.
But what if the user can’t read the distorted text produced by Captcha? It becomes a nuisance to hit the refresh button multiple times to get a legible Captcha so you can submit the form. Being in the user-experience business, we went looking for a better solution. We looked into several different technologies, but almost all of them were too bloated in size for us to find them appealing, so we had to come up with another way.
First, some explanation about spam bots: they will typically fill out every input field in a form whether or not it is visible to the user. This useful piece of information has led to the creation of the honeypot method. The honeypot method consists of putting a blank input field in your form and hiding it from the user. The bot will come across this input field and fill it in. If the form field is filled in, the sender should be marked as a bot and the form should not send. Unfortunately, setting an input field with display: none; isn’t enough to combat spam anymore. It is certainly a step forward, but as I mentioned above, spam bots are getting smarter every day and many of them have figured this little trick out and can work around it.
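To make the idea concrete, here is a minimal sketch of a honeypot field (the form and field names are placeholders, not our actual markup): an ordinary text input, hidden from humans with CSS but still present in the HTML a bot parses.

    <form action="contact.php" method="post">
        <!-- Real fields a human fills out -->
        <input type="text" name="name">
        <textarea name="message"></textarea>

        <!-- Honeypot: invisible to people, but a bot crawling the markup will find it and fill it in -->
        <input type="text" name="extra_field" value="" style="display: none;">

        <input type="submit" value="Send">
    </form>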
To prevent spam on our new forms, we used a few different “honeypot” methods combined into one form to determine if the user is a bot or not. We first implemented the standard blank honeypot text input field and set it to display: none;. When the form is submitted, we then perform a server-side check using PHP to see if the input was filled out. If so, we trigger an error and prevent the form from sending. After testing this method, we waited a couple of days to gauge its effectiveness. Unfortunately, we were still receiving some spam emails each day using this out-of-the-box honeypot functionality.
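The server-side check itself is only a few lines. Here is a simplified PHP sketch of that first version (the field name and error handling are illustrative, not our production code):

    <?php
    // contact.php - first-pass honeypot check (simplified sketch)
    if (!empty($_POST['extra_field'])) {
        // No human ever sees this field, so any value here means a bot filled out the form.
        die('Sorry, there was a problem sending your message.');
    }

    // ...otherwise validate the real fields and send the email as usual.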
We then began investigating more solutions to combat spam. This is where we really learned just how smart these spam bots are becoming. When we had first implemented the honeypot input field, we had named the input “anti-spam”. This was a bad idea. Bots are able to read through an input’s attributes to figure out what kind of field they are dealing with and, apparently, can tell from the name whether the field is really meant to be filled out. So we learned a valuable lesson: name your honeypot fields something completely irrelevant to combating spam.
After changing the honeypot input name to something like “promo_code”, we waited another day or two to see if we had better results. A day or two went by without spam, but on the second or third day we received another spam email. This was an improvement over the previous rate, but still unacceptable.
That’s when we realized some bots can bypass input fields set to display: none;. So, diving headlong into the spam battle, we implemented a method found in MailChimp’s subscription form that sets the honeypot input field to position: absolute; left: -5000px;. This allowed the input field to be “visible” but positioned off screen so the normal user couldn’t see it. We weren’t going to stop there, though. We’d had enough of those emails trying to sell us shoes and prescription drugs.
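In CSS terms, the MailChimp-style approach looks roughly like this (the class name is illustrative; “promo_code” is the kind of innocuous field name we settled on):

    <style>
        /* Push the honeypot off screen instead of removing it from the layout,
           so it still looks like a normal, fillable field to a bot. */
        .promo-field {
            position: absolute;
            left: -5000px;
        }
    </style>

    <input type="text" name="promo_code" value="" class="promo-field">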
To be sure no bots could send our contact form, we implemented second and third honeypot inputs. The second honeypot field was an HTML5 email input, and I believe this field was the key to stopping our spam emails: bots, no matter how smart, will ALWAYS fill out an email input. Just be sure to name your honeypot email input something different from your actual email input, and something totally unrelated to spam prevention. We used the name “email_2” for our honeypot email input. This email input was “hidden” the same way the other field was, by wrapping it in a <div> set to position: absolute; left: -5000px;. If either of these input fields was populated, the form would trigger an error and not send.
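Pieced together, the email honeypot and the combined server-side check look something like this (again a simplified sketch rather than our exact code):

    <div style="position: absolute; left: -5000px;">
        <!-- Honeypot email input: off screen, with a plausible-sounding name -->
        <input type="email" name="email_2" value="">
    </div>

    <?php
    // In the form handler: a value in either honeypot field marks the sender as a bot.
    if (!empty($_POST['promo_code']) || !empty($_POST['email_2'])) {
        die('Sorry, there was a problem sending your message.');
    }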
The third and final method we used to combat spam is something a little more in-depth and tricky than your average honeypot input. When a spam bot finds a form on a page, it will typically fill it out and submit within 5-10 seconds. That is much faster than a human can manage, so we figured it would be wise to check how long it took to fill out the form. To do this, we added a hidden input field whose value is populated on page load with the time the page was loaded. When the form is submitted, we perform a server-side check to see whether the time between loading the page and submitting it meets a minimum threshold. We set this minimum to 10 seconds, to be sure we weren’t going to prevent any real users from sending our form. If the form is submitted in under 10 seconds, it triggers an error and does not send.
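Roughly, it works like this: the hidden field carries a timestamp from when the page was generated, and the handler compares it to the clock when the form comes back. A simplified sketch with placeholder names (populating the field server-side is one way to do it):

    <!-- Rendered into the form when the page is generated -->
    <input type="hidden" name="form_loaded_at" value="<?php echo time(); ?>">

    <?php
    // In the form handler: humans need more than 10 seconds to fill out the form,
    // so anything faster is treated as a bot.
    $loaded_at = isset($_POST['form_loaded_at']) ? (int) $_POST['form_loaded_at'] : 0;
    if ($loaded_at === 0 || (time() - $loaded_at) < 10) {
        die('Sorry, there was a problem sending your message.');
    }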
It’s been just over a week since we’ve implemented these spam prevention techniques and we haven’t seen a single spam email come through since. As spam bots continue to evolve, however, we may have to revisit this solution down the line. But that’s part of the spam arms race that’s not going away any time soon.
New Research in Repetition is a KICK
Posted by: Mike Verstrat
NPR’s Alix Spiegel recently reviewed the research of Elizabeth Margulis, Director of the Music Cognition Lab at the University of Arkansas. Margulis took the rather free-form and non-repetitive music of Luciano Berio, a 20th-century composer, and chopped it up. Her cuts were intentional: she copied a passage and inserted it at another location, creating repetition where before there wasn’t any.
The whole point of this effort was to simply see if people liked the music more or less with repetition baked in. An extensive, random sampling of people evaluated the before and after pieces.
The results were clear:
“(The Subjects) reported enjoying the excerpts that had repetition more,” Margulis says. “They reported finding them more interesting, and — most surprising to me — they reported them as more likely to have been crafted by a human artist, rather than randomly generated by a computer.”
Spiegel’s interview with Margulis further highlights the role of repetition in music as a whole, and why this became such a passionate topic of study:
“A full 90 percent of the music we listen to is music we’ve heard before. We return again and again to our favorite songs, listening over and over to the same musical riffs, which themselves repeat over and over inside the music, and she (Margulis) became obsessed with understanding why repetition is so compelling.”
One key ingredient that draws people to repetition is what’s known as the mere exposure effect, which basically describes how people feel better about something the more often they encounter it. Margulis sums it up this way:
“Let’s say you’ve heard a little tune before, but you don’t even know that you’ve heard it, and then you hear it again. The second time you hear it you know what to expect to a certain extent, even if you don’t know you know,” Margulis says. “You are just better able to handle that sequence of sounds. And what it seems like [your mind is saying] is just, ‘Oh I like this! This is a good tune!’ But that’s a misattribution.”
Margulis also explains that the innate desire for repetition crosses boundaries of time and culture:
“Musical repetitiveness isn’t really an idiosyncratic feature of music that’s arisen over the past few hundred years in the West,” she says. “It seems to be a cultural universal. Not only does every known human culture make music, but also, every known human culture makes music [in which] repetition is a defining element.”
Margulis’ study is helping fill in the picture with some clarifying implications about why we crave repetition in sound. Some commentators on her work go so far as to suggest that our craving for auditory repetition might stem from life in the womb with the constant sound and rhythm of a heartbeat surrounding us.
So what (if anything) do these findings on musical repetition mean for the world of visual communications — more specifically for those of us concerned with designing digital experiences?
I would argue it means a lot.
After all, the phenomenon of repetition exists in the visual world as well as the audible one. In addition to her insights on audible repetition, I think Margulis’ work may also be uncovering some of the underlying forces that guide the choices visual designers and information architects make when communicating.
The point is, just as a heart beats to a rhythm, just as the hook of a great song sways our emotions — repetition in a digital experience makes us feel we’re right where we want to be.
If we think about what we do as communicators in the digital / interactive space, we’re usually set about the task of organizing information. There’s a goal out there, an idea, a concept — we try to make it clear by emphasizing the essential and removing the extraneous through the manipulation of word and image. We strive to make the complicated simple. That’s what we do in a nutshell. But of course, doing this with success is easier said than done. As Brion points out in a recent post, “… simplicity is hard to achieve, requiring a great deal of creativity; and that complexity is easy to achieve …”
One strategy for organizing the visual arrangement of information (as far as interactive experiences go) is utilizing principles of repetition, especially in key visuals and navigation elements. We often call this consistency instead of repetition, but the concepts are similar. When designing navigation, we even choose terminology to describe those elements using words that are consistent with other, similar interactive experiences (i.e. Home, About Us, Contact Us, etc.).
Think about sites you’ve visited recently. Can you recall instances where you’ve had to look all around the screen to try and track down a specific link, button, or function? How did that make you feel? Why did you look for it in the places you searched?
Arguably, you expected it to be a certain way because repetition of that way had occurred for you in the past. As Jakob Nielsen points out, an axiom to remember when developing an online experience is that “users spend most of their time on other websites”. It’s critical when designing a digital experience to be aware of the audience, and have a solid understanding of what elements they’ll expect to be repeated or consistent.
Intuitively, this all makes perfect sense. Many of our life experiences are based on repeating audible or visual patterns in time and space. The sun “rising.” Seasons. Birthdays. The wheels of your car turning. Your yearly physical (get one). The traffic light. Alarm clocks. Tides. Rows of crops. City blocks. There are things you just simply believe will be there because they’ve been there before. Repetition somehow has the power to arrest our attention, and soothe it at the same time.
Of course when it comes to preferring things repeated, there’s a limit.
Most would agree that there’s a break-point (seemingly unaddressed by Margulis’ research) where you start hearing things like, “I’m so sick of this song!” and “This ad is so overplayed!”
To be sure, there is a progression of user interface design conventions (and design conventions in general) over time. Just take a look at how things looked a short 20 years ago to realize that patterns and paradigms in UI do in fact shift, just like they do with styles and preferences in any cultural context. Additionally, experienced designers often know when it’s right to break a rule here or there in order to intentionally fragment repetition for the sake of accentuation or variety.
Still, I think the power of consistency is so strong, that comfort in knowing what to expect often trumps any need to change for change’s sake.
Take Warren Buffett’s Berkshire Hathaway corporate site, for example. One could argue that it passed a stylistic expiration date about 18 years ago. Yet many (dare I say older investor-types) see it as navigable, simple and largely device-agnostic when it comes to usability. I’d venture to guess it would cause quite a stir (for better or worse) if we fired up the URL one day (does anyone actually visit their site besides me?) and found parallax scrolling and promotional videos duking it out for our attention.
To be sure, BH’s subsidiary groups run the gamut of site design conventions, and I’m not advocating for or against their corporate site’s cemented-in approach. I’m instead pointing out where repetitive, year-after-year consistency in an online presence seems to build more forceful inertia than change — even in a Fortune 5 company.
I think the big idea that can be taken away from Margulis’ research — as it relates to things visual — is that balancing the unifying, comforting nature of repetition against the eventual desire for variety should begin with an understanding of the strong need people have to know what’s coming next. If you violate that need, you’ll be asking your audience to weather a storm of uncertainty until they’re able to continue navigating your information — that is, if they choose to stay with you at all rather than bailing out and going somewhere else.
So weigh those risks before breaking consistency, and proceed as appropriate.
You’ve probably heard of the long-standing design acronym, “KISS — Keep It Simple, Stupid!” Maybe as a starting point to achieve simplicity, it makes sense to “KICK — Keep It Consistent, Kid!”
Self-publishing and Social Media Strategy
Posted by: Matt Saler
If you run a Facebook page for yourself or your business and you take the time to monitor the analytics, you’ve probably noticed a trend: the reach of your posts has been heading downward. Your efforts on Facebook are getting less bang for your buck.
Maybe you’ve recently received a notice from Facebook, which offered to sell you ads to increase your reach. So rather than reach people who chose to Like your page and opted in to see your content in their News Feeds, Facebook is offering you the chance to pay for what you got for free before.
Any way you look at it, this is a classic “moving the goal posts” move by Facebook. Admittedly, they have very real audience size considerations: millions of businesses have Pages in their system, trying to reach over a billion users. Not all of those businesses can get 100% reach across all of their followers’ News Feeds without crowding out more personal connections. That’s a real problem.
But there is also this reality: Facebook is an ad-driven company that makes its money selling ads based on information users feed into the system. Facebook is not in business to help you if helping you costs them. It makes business sense for them to charge other businesses for access to a wider user base, especially after years of better access created a dependency. It’s their pipes you’re using, right?
Facebook isn’t the only platform changing the rules of the game. All third-party tools either are doing this now or will be eventually. Like Facebook, they are not in business for you.
This is only a real problem if you’ve developed too much of a dependency on these services.
A parallel in the real world might be this: your industry has a trade show every year that everybody goes to. Vendors and customers flock to it. It’s a huge competition for eyeballs and if you handle your presence there right, your sales do really well as a result.
If you were to consider that your only chance to reach your customers or maintain sales, however, you would be missing out on opportunities the rest of the year. And if the trade show ever changes the bar for entry, your business would suffer.
It’s the same with Facebook, Twitter, Google+ or any of the other social media platforms. If you rely too much on them, you make yourself vulnerable to the business decisions they make, rather than dependent on the decisions you make.
To carry the illustration forward, you can go to multiple trade shows, or use multiple third-party platforms. But you can’t forgo traditional sales and marketing techniques and stop hitting the pavement. That’s where the meat is. That’s where the longevity and stability of your business lies.
In the web sphere, this means putting the focus on your own website and on your own publishing. Use those other tools, but don’t rely on them. Make your website great and use the full array of tools available to you to increase the reach of your business through platforms you own.
Facebook’s audience size problem can feel like it cuts both ways. At over a billion users, it’s easy to think they’ve got the whole internet covered. And to be sure, that’s a conversation you want to be a part of. It’s just not the only conversation. It can be intimidating to focus on your own, owned platforms and risk getting lost in the wider ocean of the internet.
But as long as you go after your customers like you always used to, you’ll be okay.
Native Advertising: Or How to Spite Your Nose to Save Your Face
Posted by: Matt Saler
We spend our lives immersed in advertising, indirectly or directly applied. From the branded clothes we wear and the badged cars we drive to the commercials we wish we could skip while watching live TV and the blinking banners we try to block in our web browsers, we’re confronted with consumer choices presented by advertisers and brands pretty much constantly.
For businesses, finding the balance between tastefully promoting a product or service and overselling it has been a challenge met with varying degrees of success. You can probably think of examples from either end of the spectrum.
For consumers, the line between tasteful, acceptable promotion and annoying overselling can depend on personal limits, but in general, the preference is to learn about a company’s offerings in the least obtrusive way possible.
The accommodation of that consumer preference is behind what is paradoxically one of the most intrusive and worrying advertising trends on the web: native advertising.
Also called “sponsored content”, native advertising is mostly seen on content-driven websites like The New York Times, Buzzfeed, The Atlantic and other sites where journalism is practiced in one form or another. Native ads are pieces of content that mostly appear at a glance to be articles produced by the host site, but in reality are either ads directly produced by an advertiser or ads produced on behalf of an advertiser by writers employed by the host site.
How these pieces are called out as distinct from actual content varies, but the idea behind this mode of advertising is to keep it unobtrusive and maintain the reader’s experience. That’s because declining revenues from traditional online advertising have demonstrated the failure of the more obvious ad display options. Users either ignore or block text and banner ads. Advertisers know this, and rates go down as a result, which puts the long-term viability of online ad-supported publishing in doubt, especially for sites that don’t have an effective paywall in place.
So some sites have turned to native advertising. These ads offer advertisers a way to get in under consumers’ radars, which have been tuned to ignore previous display ads. This increased reach means higher rates, which in theory means a better long-term outlook for these content-driven websites.
What these sites end up with, however, is a muddied user experience. They are risking the dilution of their own brand by incorporating advertiser content in this way. Think of it like this: would you trust your doctor’s recommendation more or less if you knew for a fact she had a financial motivation to suggest a given drug?
As consumers become more aware of these types of ads and start to learn to screen them out like they have the prior models, the pressure will be on the host sites to reduce the distinction between their content and their advertisers’. It’s possible to follow such a trajectory down to a point where ad content is virtually indistinguishable from host site content (who reads bylines any more, right?). That would put all of the host site’s content in doubt.
Not all websites on the native content track will stick it out to that eventuality, but those that do will have bought themselves a few more years of viability at the cost of the relationship of trust they had with their own readers.
This is a game of whack-a-mole that advertisers and websites will have a hard time winning. Users will eventually recognize the signs and force a pivot to a new method of advertising that may hit the scene after trust has already been broken, making it too late for the host site. Traditional ad support for content-driven sites looks like a race to the bottom.
The revenue problem content-driven sites face is real, especially for journalism sites — as the stats show. Content production is expensive and time-consuming, but the expectation has been set by years of the open internet that content wants to be free. Native advertising is just one of the many models these sites are turning to in an attempt to staunch the wound, but it seems like a contaminated bandage if it comes at the cost of your own brand identity.
When I had more time to write my own content-driven website, a blog about the Detroit Red Wings, I accepted text ads in the sidebar. But I refused offers from advertisers that demanded I post their content or include their ads in the main content area of my site. I did that because I felt I had built a trust with my readers over the years. To inject content I couldn’t stand behind in the same place as content I had labored over felt like betrayal of that trust.
I had the privilege of doing that as a hobby and never had to rely on revenue from that site to make a living or to employ other writers. So I don’t envy the choice faced by executives at places like The New York Times, who are tasked with the long-term survival of their company. But I know there has to be another way — a way that doesn’t spite their nose to save their face.