It seems I'm starting to make a tradition of publishing an article like this every few years. The original one was refreshed by a new version in 2014, but a few new nuisances have been showing up lately, and some others have finally started to vanish, hence this update. As before, the points are sorted roughly from most annoying and most relevant to least.
My hope is that many people will read this and consider it when making or updating their websites, or at least complain to webmasters when they encounter these things on other sites. But the main purpose of this text is mostly to spit my bile and vent my (and many others') frustrations. This means it will contain a large amount of sarcasm, and may seem threatening or insulting to sensitive readers. Read it at your own risk and don't come complaining afterwards. It also means these are my personal opinions, and if you think you have good reasons to do any of the things mentioned here, just go ahead. After all, I'm not the Pope or anything.
If you scroll down to the bottom of the page, guess what you will find — if you are not using an ad-blocker — that's right, an ad! I don't give a damn if people block those ads. They produce more than enough revenue to cover the hosting of this website, which is the only thing I care about. The mere fact that the ad is positioned on the page in a way that would make advertising ‘specialists’ scream, already shows that I believe forcing people to watch advertisements is counter-productive. Initially I placed the ads there with the specific intent of limiting the income I would get from them, because the tax system in my country is so messed up that I might get into trouble if those ads made a considerable amount of money. Yet, I think this kind of ad placement is not that bad at all. Think about it, when is someone more inclined to click an ad: when it is at the start of an article they would like to finish reading, meaning the ad will be scrolled out of sight by that time, or when the ad only appears just at the moment when they actually finish reading and don't immediately know what they'll do next? “Hey, that looks interesting… click!”
Now that most browsers successfully block pop-up windows, the arms race has moved towards ‘soft’ pop-ups or so-called ‘pop-overs’ that are incorporated in the websites themselves, by means of HTML elements. The latest fad is to either wait a specific time after the user has loaded the webpage, or wait until they move the mouse outside the tab or main window, and then shove an HTML-rendered overlay in their face. The overlay will either be a nag screen begging the visitor not to leave the website or to subscribe to something, or it will just be a classic ad. It uses a glass-screen effect to prevent the visitor from doing anything before the pop-up has been dismissed.
This is yet another type of nuisance that will prove counter-productive in the long term. I do not want to be interrupted while I am reading an article. And when I move the mouse outside the window, I am not necessarily planning to leave the page. This gimmick gets especially old if the website cannot remember that the pop-up has already been shown a few times during this same browsing session, and it keeps on annoying the visitor who has already clicked away the obnoxious thing five times. I wouldn't mind if this kind of pop-up were shown once, but if I have dismissed it twice already, it surely is not likely that I'll fall for it the third time, so please stop trying.
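Remembering a dismissal for the rest of the browsing session takes only a few lines. A minimal sketch (the key name and function names are hypothetical, not any framework's API); `store` stands for any Storage-like object, so that in a browser you would pass `window.sessionStorage` and the flag would reset when the session ends:

```javascript
// Decide whether the nag pop-up should be shown, and record dismissals.
const KEY = 'nagPopupDismissed'; // hypothetical storage key

function shouldShowPopup(store) {
  // Only show the overlay if it has never been dismissed this session.
  return store.getItem(KEY) === null;
}

function dismissPopup(store) {
  store.setItem(KEY, '1');
}
```

In the browser, the overlay's close button would call `dismissPopup(sessionStorage)`, and the page would skip rendering the overlay entirely whenever `shouldShowPopup(sessionStorage)` is false.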
This is one of the most prevalent fads that started around 2012 and is still going strong in 2017. I load a page and see no indications of page numbers or how many items there are. The only indication is a moderately-sized scroll bar in my browser. I happily start scrolling, assuming I can quickly get an overview of the entire page. But when I almost reach the bottom, it suddenly sprouts new items and the scroll bar jumps up. As I continue scrolling, this happens over and over again. It never ends and I get the feeling I am inside Zeno's paradox, or one of those feverish nightmares that seem to have no end. The phenomenon is called ‘infinite scrolling’ and some believe it is the logical evolution beyond pagination. I beg to differ.
In ancient times people stuck paper together until it was one immensely long ribbon. Terribly unwieldy, so they rolled it up into scrolls. Accessing information inside the scroll involved transferring paper from one roll to the other until the desired information showed up. Impractical, time-consuming, and it was difficult to estimate how much information a scroll contained. Then people invented books with pages. Scrolls went out of fashion very quickly because one can place a bookmark in a book and jump straight to a page after looking it up in an index. A similar evolution happened with audio and video. A cassette is basically a scroll in a box, and has the same problems. This is why, since the advent of random access media like CDs, hard drives, and solid-state memory, almost nobody uses cassettes anymore.
The web never had a prehistoric sequential data period. It has always been a collection of separate pages with hyperlinks. Bookmarking a page used to be trivial, and a quick glance at the scroll bar when opening a page revealed how much information it contained. For some reason web designers decided this lack of a prehistoric period has to be compensated for. To be trendy and hip, a modern website must now mimic ancient text and multimedia storage systems. Just imagine that when viewing YouTube, you need to fast-forward through a dozen videos of kittens and people performing dumb antics until you find what you want, as was the case with a VHS tape. Why would something similar be justified for text-based content?
It is not hard to figure out where it all went wrong and why history seems to be going in reverse here. The two main culprits are touchscreen devices and the general inability of humans to cope with more than a single paradigm, or perhaps utter laziness of web designers. It is rumoured that Twitter was one of the first services to introduce infinite scrolling. There it does make some sense, because tweets are short and nobody really wants to scroll beyond the first dozen newest tweets. On many other websites, however, it doesn't. Infinite scrolling also somewhat makes sense on touchscreen devices that can only be controlled by pushing and swiping one's meat sticks across a display. Scroll bars are awkward on such devices and eat away the often scarce screen real estate, so Apple Inc. got the idea of almost entirely obliterating them. On a device with the aforementioned limitations this was an OK trade-off between ease of use and ease of retrieving data. On a desktop PC or laptop it makes no sense. If you want to offer infinite scrolling on mobile devices, fine. But please at least give people with less constrained computing devices the option of pagination.
Besides making content difficult to find, infinite scrolling pages also risk choking the browser. Ironically, this risk is highest on mobile devices, which generally have limited computing power compared to a desktop PC. Therefore the option to view content in paginated form is useful even on a mobile website.
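Offering pagination alongside infinite scrolling does not require much: the server (or client) just has to slice the item list into pages. A minimal sketch, assuming nothing about any particular site (the function and field names are my own):

```javascript
// Slice a list of items into pages of a fixed size.
// Returns the requested page plus enough metadata to render
// page numbers and previous/next links.
function paginate(items, page, pageSize) {
  const pageCount = Math.max(1, Math.ceil(items.length / pageSize));
  const current = Math.min(Math.max(1, page), pageCount); // clamp out-of-range pages
  const start = (current - 1) * pageSize;
  return {
    items: items.slice(start, start + pageSize),
    page: current,
    pageCount,
    hasPrev: current > 1,
    hasNext: current < pageCount,
  };
}
```

A site could keep infinite scrolling as the default on touch devices and render the exact same `paginate()` result as numbered pages everywhere else.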
Infinite scrolling is also horrible for visitors who rely on assistive technology. I don't even know whether screen readers can cope with it; maybe they will only present the first chunk of loaded data. Even if they can keep loading more content like a regular browser, it must be pretty horrible to find anything in the endless flood of data with no means to jump to any particular part. Again, it is similar to having to find a song on a cassette when you don't even know beforehand what is on it, or how much.
If you still believe infinite scroll is the best thing since sliced bread, this presentation by Dan McKinley shows what can happen when you blindly assume that infinite scroll will improve your particular website just because it appeared successful for others.
What bothers me a lot about the much-hyped era of “web 2.0” is that it has spawned an endless series of very similar websites where form is favoured over function. There is the typical blog with some photo at the top, which might be intended to give the website an identity. Unfortunately this identity gets completely lost due to the same site template being used at a gazillion other sites. Even if the photo was made by the authors themselves and is not a stock photo, it obediently follows Photography 101 rules, making it as utterly forgettable as the rest of the site.
Then there are variations on this theme where more crap is added that makes the site even more forgettable. Tag clouds, photo streams, tweets, social media ‘like’ buttons and random faces of people who liked this page scattered everywhere. Ads or surveys that pop up after viewing the page for ten seconds. Somewhere amidst this crap might be the actual content, but sometimes it is a challenge to find it, and people might have clicked one of the dozen distracting elements before getting to that point.
Both problems are the result of copy-paste web design where frameworks are piled on top of each other, and used improperly such that resources start to leak and are never cleaned up. The designers of such websites are skilled enough to string the frameworks together, but not to make a neat design and validate it through testing. The website may look nice at first sight, but eventually it will kill the computer or mobile device that tries to keep it open for an extended period. Next time you want to ‘enhance’ your website, ask yourself whether it really is necessary. Sometimes, less really is more.
With the advent of CSS, the possibility to use “text-decoration: none” arose. This means that people can actually turn off the underline for hyperlinks, which has been the default since the invention of the web browser. This is not bad as such, as it allows tailoring the appearance of links to a custom web design. However, many people like to use this to make links totally identical to the surrounding text. Only when the user hovers over them with the mouse do they become visible, and sometimes even that has been disabled! In the latter case, the only hint that a word is clickable is the cursor changing to a hand icon.
Now tell me, do you believe anyone wants to scan every word of every webpage with the mouse in order to detect where the wonderfully hidden links are? No! Links must be visible at first glance, whether through an underline, a different style, a colour, or whatever. Blue underlined text is burned so deeply into the common knowledge of people that it is the ideal way to indicate links. On any webpage where the hyperlinks are the main feature (for instance, a search results page), both underlines and the blue colour must be used. On other pages it is OK to drop either the blue colour or the underline, but never both. Neither underlines nor blue text should be used for anything other than links, unless their meaning is clearly indicated. There is never a good reason to make links invisible, except in a game of “find the hidden links”. There is one hidden link in this very paragraph. Did you find it?
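Customising links while keeping them recognisable only takes a few rules. A stylesheet sketch of the compromise described above (the selectors and colour value are just an example, not a prescription):

```css
/* Links keep the familiar blue and underline by default. */
a {
  color: #0645ad;
  text-decoration: underline;
}

/* In body text it is acceptable to drop ONE of the two cues,
   but never both: here the underline goes, the colour stays. */
article a {
  text-decoration: none;
}

/* Hovering restores the underline as an extra confirmation. */
article a:hover {
  text-decoration: underline;
}
```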
In this regard, the new layout that Google rolled out for its search results around March 2014 baffled me. They removed the underlines, in what seems to me a change just for the sake of change. They seem to have attempted to bring the layout more in line with the current awful trend of whiteout-inducing user interfaces that feature text floating in a sea of white space with no separators anywhere. The new results page rubs me the wrong way so much that I am starting to actively avoid using Google. Bad move, guys.
Suppose I enter a website through a hyperlink or a search engine, but it is in too deep a level, e.g.
Now imagine I want to see all rubber duckies, but I can't immediately find a navigation link to go to that higher level. The logical thing to do is to cut off all the parts of the URL after ‘rubberduckies’. In other words, go to
In a well-designed website, I would then arrive at the page with the overview of all rubber duckies. But in many cases I get ‘permission denied’, although I am pretty certain that I have permission to see all rubber duckies. I might also get a ‘404 not found’ or, heaven forbid, ‘500 internal server error’, depending on the skills of the web developers. This forces me to go back all the way to the main page and re-traverse the navigation structure. That sucks. To prevent this on a basic static web server, it is as simple as naming the main page of each subdirectory ‘index.html’ (or ‘index.htm’ or ‘index.php’). In more advanced servers, there are always ways to make sensible redirects pointing to the kind of page a visitor expects to see when entering a certain URL path.
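The static-server case boils down to a simple path-resolution rule, which can be sketched as follows (the helper name and the `fileExists` callback are my own illustration, not any particular server's API):

```javascript
// Map a requested URL path to the file a static server should serve:
// a path that names a directory falls back to that directory's index page.
const INDEX_FILES = ['index.html', 'index.htm', 'index.php'];

function resolveRequest(path, fileExists) {
  if (!path.endsWith('/')) {
    if (fileExists(path)) return path; // a real file: serve it as-is
    path += '/';                       // otherwise treat it as a directory
  }
  const index = INDEX_FILES.find((name) => fileExists(path + name));
  return index ? path + index : null;  // null means a clean 404,
                                       // never 'permission denied'
}
```

With a rule like this, trimming a URL back to its ‘rubberduckies’ directory serves the overview page instead of an error.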
This has been a topic since my original article, but ironically it has recently become much worse due to the practice of ad-blocker-blockers I discussed above. Therefore I now limit this to the case where plain poor webpage design causes unimportant content to block the actual content.
Some sites somehow manage to include non-essential content like banner ads, word clouds, social media widgets, or bookmark buttons, implemented so badly that browsers are unable to render the rest of the page before this content has been loaded. If the content cannot load at all, visitors see nothing until they wait for a timeout. If the ‘stop’ button is pressed before this timeout, the page will often stay blank. When I visit a webpage with some text on it, I expect to be able to read that text. I do not care much about being able to ‘like’ the text or link it to some trendy online tool that I won't be using anymore in a few months anyway. Mostly I just give up and go to another site if I can't see anything within a reasonable time, and most other people will do the same.
There appear to be some websites whose whole purpose is to pretend to have useful content, but actually to load as much fringe garbage as possible around what is only a very superficial remnant of the purported content. Obviously, those sites are all about making people click ads or dodgy money-making schemes (hmm, is there any difference?). There should be some general quality mark for every website, and such sites should get negative marks.
The only reason why this could be justified, is to prevent ‘deep linking’, i.e. other sites linking directly to pages deep within the site. Then again, there should be nothing wrong with that; it is the whole idea of the internet. Any properly designed site shows its identity on each webpage, with a navigation system that allows the visitor to go to the home page. Even in sites with frames, where a deep link would not load the entire frameset and would leave the visitor stranded without navigation, this technique is still not justified. Of course frames are utterly obsolete, but in case someone is still forced to use them: it only takes a little bit of scripting to detect a deep link and reload the entire frameset with the linked page inside it, instead of going for the lazy solution of kicking the visitor to the home page.
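That little bit of scripting amounts to one comparison and a redirect. A sketch (the frameset file name and the `page` query parameter are hypothetical):

```javascript
// Build the URL that restores the full frameset around a deep-linked page.
function framesetUrlFor(pagePath) {
  return '/frameset.html?page=' + encodeURIComponent(pagePath);
}

// In the browser, each content page would then run something like:
//   if (window.top === window.self) {
//     location.replace(framesetUrlFor(location.pathname));
//   }
// so a deep link reloads the frameset with the linked page inside it,
// instead of dumping the visitor on the home page.
```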
Imagine you are cozily surfing the web, with your desktop neatly arranged, and your browser window perfectly tuned so you can see enough of the websites while still keeping an eye on other windows next to the browser. Then suddenly you enter a site and POP! It blows up the browser to full screen, making it explode in your face. I really, really, cannot think of any reason why anyone should do this. Except maybe to express:
hey, my site is so great that I am sure you'll want to view it full screen, so I make it full screen for you! Unfortunately all the sites I encountered that did this, sucked. Moreover, I want to decide myself how large my browser window should be. If a site only looks good in full screen, then it sucks from the start.
I wonder how many rotting corpses of dead epilepsy patients are lying behind computer monitors right now, with an ad still flashing:
you are the 10000th visitor of this site! There's also a variation where
you are the 999999th visitor, and the ad is throbbing and shaking as if it is being held in the hands of someone who drank ten too many cups of coffee. Of course everyone is the ‘10000th’ or ‘999999th’ visitor of that site, but that doesn't matter. What matters is that flashing and shaking things are an ideal visitor repellent, and the repulsion force is proportional to the area and frequency of the flashing thing. Especially on pages where the user is supposed to read even a small piece of text, fast-moving and flashing things are not done. If there is nothing else of any importance on a page, and the page is only to be seen during a short period, then it may be OK to put something flashing on it. But please, no flashing junk on any page with more than two sentences of text!
The first is to prevent visitors from right-clicking images to save them. This is a childish and dumb idea. First of all, the image has already been downloaded, otherwise the user wouldn't see it. By right-clicking and choosing ‘save as’, the user just moves the already downloaded copy somewhere else. Prohibiting this insults those visitors by treating them like criminals who would not understand a warning like:
it is forbidden to use images on this site for any other purpose than personal viewing. Saving an image somewhere to be able to view it offline when your site is long gone, is still personal viewing and is a fair right.
Moreover, right-clicking is used for many things other than saving images or viewing HTML source: for instance, opening links in a new tab or window, bookmarking them, looking up text, … Sabotaging these abilities is a sure way to piss off your visitors and chase them away.
In the old days of the Internet it was simple: the certain operating system was Windows. The number of sites whose main content was usable only in Mac OS or Linux was negligible, if not nonexistent. But that did not matter, because they would have sucked for exactly the same core reasons. When someone designs a website in such a way that it only works in operating system A with browser B, they are a bit like a virtual racist. Especially when done conspicuously, by redirecting people who do not use the über-software to a page that bluntly states they use the wrong system, without even giving them a chance to try it with their ‘inferior’ system.
I had hoped these practices would disappear as the Internet matured, but this might have been a vain hope. Microsoft has finally lost its monopoly on web browsing, but now other companies are gaining user base. While those companies previously made an effort to follow widely adopted standards, because that is by far the best strategy for an underdog, their increasing market share now risks pushing them into the same arrogance MS exhibited with Internet Explorer. And this arrogance again risks spilling over to website designers.
Imagine you have a shop and 100 potential customers. You pick out ten of those customers, based on the fact that they wear shoes of a certain brand. And you put two bodyguards at the entrance, with the order to prevent those people from entering your store at all cost, even though your very best customers might be among those ten. Does that make sense? As much as taking 10% of your income in the form of paper money, and eating it. Or dancing naked in a town square, holding a fresh herring in each hand.
Making your site cross-browser, cross-platform is not as hard as you may think, and makes much more sense than locking out a group of people because they prefer to use a different system than you. Putting a label
Designed for / Best viewed with browser B on your site is nothing more than telling:
I am too lazy or incompetent to make my site work for everyone, or:
I believe we're still in the year 1997. It is OK to include an extra gimmick on your site that will only work in a certain browser, but it is not OK to make this gimmick the core part of your site.