## Welcome!

Hi there! I'm Julian M Bucknall, a programmer by trade, an actor by ambition, and an algorithms guy by osmosis. I chat here about pretty much everything but those occupations. Unless you're really lucky...

Most recently this is what I've come up with:

## Making your web pages fast (part one)

Recently, I had occasion to want to read an article on <a well-known development company>’s developer blog. It took, believe it or not, over 17 seconds to load and display on my wired connection, around 10 seconds longer than I would have waited if I hadn’t wanted to read the content. Apparently on a phone it took over 60 seconds to load. I ran it under Firebug because I just didn’t believe it and wanted to see what would take so long, and tweeted the results.

So this one blog post used 117 HTTP requests for various files (HTML, CSS, JavaScript, images, whatever) from 16 separate web servers. It took a smidge under 7 seconds just to generate and receive the initial HTML page (from which all the other requests would be derived). It was a grand total of 17 seconds before the browser signaled the onload event (after which a whole bunch of scripts would run, etc). All in all, pretty bad. And in reality a lot of this can be avoided with just a little more care.
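Firebug’s waterfall gives you these numbers, but you can also pull them straight from the browser via the (since-standardized) Navigation Timing fields. A minimal sketch – the `timing` object passed in here is a stand-in for the browser’s `window.performance.timing`, with made-up numbers roughly matching that blog post:

```javascript
// Summarize key page-load milestones from a Navigation Timing-style
// object (all values are millisecond timestamps, as in
// window.performance.timing).
function summarizeTiming(t) {
  return {
    // Time until the first byte of the HTML response arrives
    ttfb: t.responseStart - t.navigationStart,
    // Time until the initial HTML document is fully received
    htmlReceived: t.responseEnd - t.navigationStart,
    // Time until the browser fires the onload event
    onload: t.loadEventStart - t.navigationStart
  };
}

// Roughly the numbers described above (illustrative, not measured):
var summary = summarizeTiming({
  navigationStart: 0,
  responseStart: 5200,
  responseEnd: 6900,     // a smidge under 7 seconds for the HTML
  loadEventStart: 17000  // the 17-second onload
});
```

In a real page you’d call `summarizeTiming(window.performance.timing)` once the load event has fired.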

When we navigate to a web page, we tend to have certain expectations. We assume that the page renders quickly, or at least quickly enough that we’re not aware of it (rather than the opposite case: wondering whether our sketchy internet connection has died again). We also take it as read that there won’t be weird rendering artifacts, such as the content rendering one way and then immediately rendering in another. After all, we are visiting a web page because we have one or more tasks to accomplish with that page. Our task may be as simple as reading the article, or it may be that we need to see some list of products, one of which we want to buy, or it may be logging in.

In this series of posts, I want to explore how to present the content of a web page as quickly as possible to the reader. It’s based on a session I’ve presented at various conferences over the past year, and it’s also been used by others at DevExpress when I’ve been unable to attend.

The first thing to realize is that it’s not necessarily about raw performance – although that has a lot to do with it – but rather it’s about perceived performance. If the web developer was canny enough to present the content you, the reader, needed as fast as possible, but the remaining parts of the page took longer (say, a list of recent posts or similar posts, ads, the tweet stream, whatever), you’d rate the page as a whole as faster than the alternative (that is, get all the data and render only when it was available). The overall time to render the whole page would be the same, give or take, but the reader’s task (read the content) could be started much earlier. The “performance experience”, if I may call it that, is essentially subjective, and not necessarily objective.
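One way to act on that: render the primary content in the initial HTML, and only fetch the secondary bits (recent posts, ads, the tweet stream) once the page has loaded. A sketch of such a deferral helper – `win` stands in for the browser’s `window` object, and the widget loader in the comment is hypothetical:

```javascript
// Defer fn until the window's load event has fired; if the page has
// already finished loading, run it on the spot.
function whenLoaded(win, fn) {
  if (win.document.readyState === "complete") {
    fn();
  } else {
    win.addEventListener("load", fn);
  }
}

// In a real page you'd write something like:
// whenLoaded(window, function () { loadRecentPostsWidget(); });
```

The reader gets the article immediately; the sidebar fills in when it fills in.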

As an example: navigate to Amazon.com. As you do so, don’t look at the page, but stare at where the scrollbar will be displayed on the right. Depending on your connection speed, general traffic, etc, you’ll glimpse the main banner displayed on the left out of the corner of your eye, well before the scrollbar gets displayed when the rest of the page is downloaded and renders large enough. An Amazon shopper’s perception then will be that the website is displayed instantaneously, even though the content “below the fold” doesn’t arrive immediately. You could say Amazon’s devs have heightened this perception to the level of performance art.

There have been studies published on web response times showing that taking longer than a few seconds means the user will probably leave and maybe never come back. One such study is Jakob Nielsen’s article on the subject, where he divides response times into orders of magnitude.

• At around 0.1 second, the user feels it’s instantaneous. To quote Nielsen: “The outcome feels like it was caused by the user, not the computer. This level of responsiveness is essential to support the feeling of direct manipulation.” That is, at this order of magnitude, it feels like the browser is directly responding to you, the user.
• At around 1 second, the user is aware that the browser is doing something or that the network is introducing some level of latency, but their train of thought is not broken (and as devs we know how annoying that can be). Quote: “Users [. . .] still feel in control of the overall experience and that they're moving freely rather than waiting on the computer.”
• At 10 seconds, well, that’s it: you have pretty much lost the user. Nielsen again: “They start thinking about other things, making it harder to get their brains back on track once the computer finally does respond.” That page I was describing above? I really wouldn’t have bothered had I not been timing it. (And in fact I haven’t visited that overall blog site again since, so it could have the best content in the world; I wouldn’t know.)
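Nielsen’s three thresholds are easy to encode. A toy classifier (the band labels are mine, not Nielsen’s):

```javascript
// Classify a response time (in seconds) into Nielsen's three bands.
function responseFeel(seconds) {
  if (seconds <= 0.1) return "instantaneous";  // feels caused by the user
  if (seconds <= 1) return "uninterrupted";    // train of thought intact
  if (seconds <= 10) return "waiting";         // attention is slipping
  return "gone";                               // the user has likely left
}
```

By this scale, the 17-second page above lands squarely in the “gone” band.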

Next time, in part two, I’ll talk about how to measure the speed of a webpage and how to use that information to speed up your own webpages. As an amuse-bouche until then, I will consider optimizing three main things: the number of requests for files, the sizes of the files returned, and where they are coming from. Sharding to the rescue! After that I’ll consider the content of the markup and how that can affect your perception of the rendering speed of the page.
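To see why the number of servers matters, it helps to count requests per host over a captured resource list; something like this sketch (using the WHATWG `URL` class as a convenience, with made-up example URLs):

```javascript
// Count how many requested resources come from each host.
function requestsPerHost(urls) {
  var counts = {};
  urls.forEach(function (u) {
    var host = new URL(u).host;
    counts[host] = (counts[host] || 0) + 1;
  });
  return counts;
}

// Hypothetical resource list for a page:
var counts = requestsPerHost([
  "https://example.com/index.html",
  "https://example.com/site.css",
  "https://cdn.example.com/jquery.js",
  "https://ads.example.net/banner.png"
]);
// counts maps each host to its request count, e.g. "example.com" to 2
```

A page pulling from 16 hosts, like the one above, pays DNS lookup and connection setup costs for each of them.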

Now playing:
Enigma - Mea Culpa
(from MCMXC A.D.)

## The HTML end tag means end of document, or does it?

As anyone who’s ever written an HTML document would surely know, everything apart from the initial DOCTYPE declaration appears in between <html> and </html>. Putting it in XML terms, an HTML document consists of one element, the HTML element. And, as it happens, it has two elements within it: the head and the body. End of story? Well, no; otherwise I wouldn’t be writing this.

Paul Usher (DevExpress tech evangelist extraordinaire) and I were perusing some extremely – can I be blunt here? – crappy ASP.NET MVC code the other day, written by a development outsourcing company that obviously should be doing anything but development, when I came across a whole bunch of <script> tags below the final </html> tag. Wait, what? (And that wasn’t the biggest facepalm in this HTML document: for example, Bootstrap, which uses jQuery for its plugins, was being loaded before jQuery. Which led to an error, which led to us reading this code in the first place.)

So I did some searching. I mean it seems pretty obvious to me that the HTML end tag is, well, the end of the HTML document, but maybe I was wrong.

The first step was the specifications for HTML at the World Wide Web Consortium (W3C). There, under section 8 of the HTML5 spec, we are told:

“Documents must consist of the following parts, in the given order:

1. Optionally, a single "BOM" (U+FEFF) character.
2. Any number of comments and space characters.
3. A DOCTYPE.
4. Any number of comments and space characters.
5. The root element, in the form of an html element.
6. Any number of comments and space characters.”

In other words, after the HTML element itself (and elsewhere in the spec it says that element can only consist of a head element followed by a body element) all that can appear are HTML comments (defined elsewhere in the spec) and space characters (ditto). Certainly no script elements can be there. If I were writing a parser, I could pretty much assume that everything after that closing HTML tag could be ignored. The weird thing is, the browser (in my case Firefox) was reading, loading, and acting on those extra-curricular scripts.
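Given the spec, a crude lint check falls out of it: after the closing tag, strip comments and whitespace and see whether anything is left. (A regex is no HTML parser, of course; this is a sketch, not a validator.)

```javascript
// Rough check: does anything other than comments and whitespace follow
// the closing </html> tag? (Per HTML5 section 8, nothing else may.)
function hasJunkAfterHtml(source) {
  var idx = source.toLowerCase().lastIndexOf("</html>");
  if (idx < 0) return false;
  var tail = source.slice(idx + "</html>".length);
  // Remove HTML comments, then whitespace; anything left is invalid.
  tail = tail.replace(/<!--[\s\S]*?-->/g, "").replace(/\s+/g, "");
  return tail.length > 0;
}
```

Run against the markup we were reading, this would have flagged those stray script tags immediately.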

A bit more research turned up a couple of StackOverflow questions about the subject (and believe me it’s hard to know what keywords to search for). The best one I found even came with a recommendation from Google about deferred CSS being put after that </html> tag. A couple of valid points were raised about this practice, the main one being that it’s entirely user-agent specific as to what happens. In other words, there’s no guarantee that the browser you are using will act in the same way as the one your user might be using (across the universe of all possible desktop and mobile browsers). Validation services (such as W3C’s) will certainly label the HTML as being invalid, but a browser will usually bend the rules a bit and try to do the right thing (for some definition of “right”). I have no idea why Google devs of all people would recommend putting anything after </html>.

So there you have it. Don’t put anything after the closing tag for the HTML element. You’re basically going to be crossing your fingers that it’ll be found and parsed and acted on correctly. Just move it up to just before the </body> tag – that’s a whole two statements, if you’re counting – and you’ll be fine.

Now playing:
The Jazzmasters - Down so Low
(from The Jazzmasters 3)

## That time when CSS’ position:fixed didn’t

There’s been an ongoing bug with my blog after I added the “hamburger” menu/options on the left side. In essence, adding it interfered with another “feature” of the individual blog post pages where the title of the post sticks to the top of the browser window as you scroll down through the text. And, yes, you guessed it, both features are provided by JavaScript libraries, different ones, by different people. It’s this week’s edition of JavaScript Libraries Gone Wild!

Let me describe what was happening. The slide-in panel menu always worked perfectly. There were no issues there at all: click on the hamburger, the side panel slides in from the left. Click on it again (or anywhere in the main page), it slides back to become hidden. Brilliant. Just what I wanted. The way the sticky title was supposed to work is that you scroll down through the blog post and when the title reached the top of the browser window, it sticks there and the remaining content just slides underneath as you scroll. If you scroll up again, when the content reaches the point where the title is designed to be, it unsticks and is shown in its designed position in the page. Supposed to. What was actually happening was that, when you got to the point where the title should stick, it simply disappeared. Scroll up again and where it would come unstuck, it magically reappeared. What the…? Time to look at my JavaScript and that of the two libraries.

To be honest, at this point I was reminded of @iamdevloper’s recent tweet:

Maybe I should just discard both libraries and write my own code that does both things. Said every CTO ever, then instructing his web team to write it.

The problem with my investigation was I naturally assumed it was a JavaScript problem. I had two separate libraries, jPanelMenu and Sticky, both of which worked just fine on their own, but somehow, together, the sticky blog post title was anything but. When I first noticed this effect, I wrote some JavaScript to turn off the panel menu once the page had scrolled enough that the hamburger wasn’t visible any more. If it’s not visible, you can’t click on it, right? Since the blog post title always appeared below the hamburger, this workaround worked pretty well.

```javascript
// ...
makeStickyTitle = function() {
  $("#postTitleContainer").sticky({ topSpacing: 0 });

  $(window).on("scroll", function () {
    var scrollTop = $(window).scrollTop();
    if (panelMenuActive) {
      if (scrollTop > hamburgerHeight) {
        panelMenu.off();
        panelMenuActive = false;
      }
    } else {
      if (scrollTop <= hamburgerHeight) {
        panelMenu.on();
        panelMenuActive = true;
      }
    }
  });
},
// ...
```

Well, not completely well. You see, turning the panel on or off caused a stutter in the scrolling as the styles for various elements in the DOM were updated, handlers were removed and added, and elements were manipulated in various weird ways. You could grab the scrollbar thumb with the mouse and drag it down towards the bottom of the page: it would stop after the panel was disabled or enabled, necessitating that it be grabbed again to complete the action. The sticky title, though, worked perfectly.

Time for some debugging action. First I used, as usual, Firebug. It’s the debugger I’ve used over the years and I like it a lot. I turned off my hack scroll event handler, and was able to show that the Sticky code was doing the right thing at the right time by adding the “position: fixed;” style to the title element. However, although the DOM didn’t show anything untoward style-wise, the title just disappeared. When I turned on the scroll event handler and followed through in the debugger, exactly the same style changes produced a visible title element stuck to the top of the window. It was most frustrating.

I then switched to Chrome’s debugger. Maybe it’d show a bit more information, or some different information, about what was going on. And blow me down, but it did. You see, if you click on an element in the HTML view, it shows a popup over the element in the preview pane. If the element is not visible there, it also adds a little triangle to that popup to point to where the element is, if only you’d scroll that far up or down. And the “invisible” sticky title wasn’t invisible at all: it was just positioned at the top of the page, not the window. The “position: fixed;” style wasn’t working properly for some unfathomable reason.

Now it was time for Google and some searching. It turns out that it’s a known problem with “position: fixed;” and, get this, the “transform” style. Wait, what? Where did that come in? Yes, you guessed it: the panel menu code was setting a “transform” style on the main page when it was activated (so it could transform the entire page by shifting it right to show the side panel). It turns out that once “position: fixed;” gets used on an element inside a transformed container, the element’s position becomes relative to the container, not the browser window. Eric Meyer has the best explanation for the effect, from four years ago, natch. To quote: “…a transformed element creates a containing block even for descendants that have been set to position: fixed. In other words, the containing block for a fixed-position descendant of a transformed element is the transformed element, not the viewport.”

And that was my problem in a nutshell. The title is an h2 element with a fixed position whose parent’s parent’s … parent is a div with a transform style. Boom! It becomes fixed to the container and not the browser window. Since the top of the container is the top of the page, which isn’t visible at that moment, the title just … disappears. And the transform? An in-place shift, essentially; a transform to exactly the same place. It wasn’t doing anything visibly in the non-activated state.

At that point, my hack was made easier. Just remove the style when the hamburger wasn’t visible, and add it back again when it was.

```javascript
// ...
makeStickyTitle = function() {
  $("#postTitleContainer").sticky({ topSpacing: 0 });

  $(window).on("scroll", function () {
    var scrollTop = $(window).scrollTop();
    if (panelMenuActive) {
      if (scrollTop > hamburgerHeight) {
        var panel = $(".jPanelMenu-panel");
        panelTransform = panel.css("transform");
        panel.css("transform", "");
        panelMenuActive = false;
      }
    } else {
      if (scrollTop <= hamburgerHeight) {
        $(".jPanelMenu-panel").css("transform", panelTransform);
        panelMenuActive = true;
      }
    }
  });
},
// ...
```

Since it was the actions of (de)activating the panel that were causing the scroll hesitation, this code, although still a bit of a hack, does not have the same problem. That’ll do for now, since the next step will be to write my own sliding panel code.
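Eric Meyer’s rule can be stated as a tiny function: walk up from the fixed-position element, and the first transformed ancestor, if any, is the containing block; only if there is none does the viewport win. A sketch over plain style objects (not real DOM nodes, and the panel object here is illustrative):

```javascript
// Given an array of ancestor style objects, ordered from the element's
// parent up to the root, return which one acts as the containing block
// for a position:fixed descendant. Per the CSS transforms behavior
// described above, the nearest transformed ancestor wins; otherwise
// it's the viewport.
function fixedContainingBlock(ancestors) {
  for (var i = 0; i < ancestors.length; i++) {
    var transform = ancestors[i].transform;
    if (transform && transform !== "none") {
      return ancestors[i];
    }
  }
  return "viewport";
}

// The blog's situation: a transformed panel div somewhere up the chain,
// so the "fixed" title sticks to the panel, not the browser window.
var panel = { id: "jPanelMenu-panel", transform: "translate3d(0, 0, 0)" };
var block = fixedContainingBlock([{ transform: "none" }, panel]);
```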

Now playing:
Yello - Rubberbandman [Rubber Mix]
(from Eccentrix Remixes)

## Putting on the Blue Apron

In our house, we’ve divided up what might be called the food duties. I’m the savory cook and Donna the pastry chef. It’s not like we sat down early on in our relationship and threw the dice; I’m just not interested in baking cakes, making cookies, or rolling out pastry for a fruit pie, whereas Donna is. She, on the other hand, would way prefer someone else do the meats, the veg, the salads.

This week's recipes

Of course, over the months and years, I’ve got into a rut. Every now and then, I’ll read up on a recipe or see something in a cookery show and ring the changes as it were, but usually I select from a rotation of “recipes” (and I use the word loosely) to cook for our evening meal.

A few months back one of Donna’s friends was talking about a new service she and her husband were trying out: Blue Apron. This is a home cooking service: they design the recipes with some interesting (and perhaps unknown) ingredients that take about 30-40 minutes to prepare and cook, provide you with the ingredients, pre-measured and counted, and deliver them in a big old box once a week. For a couple like us, it costs $60 a week for three meals, so about $20 a meal, which is roughly what I would spend anyway for us both.

We thought about it, and then after a week of particularly unimaginative meals, I pulled the trigger and ordered it.

The next Friday, the box arrived. Oh man, what a treasure trove! When they say they provide all the ingredients, pre-measured, they really mean it. There were little bags with just the right amount of grated Parmesan cheese, little tubs with (4 tablespoons of) butter and (2 tablespoons of) tomato paste, cute little plastic bottles of red wine vinegar. The meat and the fish were in vacuum packed bags, the vegetables and salads loose in the box. And there at the bottom of the box were these two big ice packs (between which the meat and fish were sandwiched). The whole lot was in this silvery, bubble-pack bag, which was itself in the box. Considering this had been travelling two days (they use 2-day delivery from California), the ice packs were still rock solid and the whole box ice cold inside. The only things they expect you to provide are salt, pepper, and olive oil, and of course you should put everything in the fridge as soon as possible.

The recipes are printed in full color on card stock (and by being so, basically encouraging you to file them ready for the next time you want to revisit a particular meal). On the first side is a picture of the final dish with a bit of blurb (“[the steaks] pair effortlessly with beautiful purple potatoes and piquant scallions” goes one description), together with a list of the ingredients. On the other side is a six-panel description of the recipe and how to make the dish.

The first step in each recipe is basically the same instruction: prepare the ingredients. Me? Prepare the ingredients before starting to cook? What? I’m a toss it in the pan kind of cook and if I need something I prepare it then and there just before adding it to the pan. Nope, time to grow up and have everything ready in advance in little dishes. (And, yes, after a month of these recipes, I’ve started to do this on those evenings I don’t have a Blue Apron dish to cook.) Similarly, the last panel is all about how to Finish and plate your dish. “Plate” as a verb? OK, I’ll roll with it.

We’re now on our fifth box, so it’s time to ask: was it (is it?) worth it? Overall, I’d have to say yes. We’ve had some enjoyable, tasty meals from Blue Apron, meals that we’d have again (and that I’d be willing to buy the ingredients for and prepare from scratch). There have been a few chef-induced disasters along the way – thankfully not many – of which my Chicken Milanese will go down as the ultimate WTF. (It’s breaded chicken breasts, but after the required cooking time, it was overdone on the outside and still pink in the middle. So I sliced them in two horizontally to cook them some more, but the breading went soggy, and Donna was very sympathetic when I sobbed during the plating of the resulting gooey mess.) There have been a couple of recipes, admittedly, that we just didn’t like particularly, but that’s fine too: at least we’re trying new things. The aforementioned purple potatoes? Lovely. My rice/beef stuffed poblano peppers? To die for. The squid-ink linguine was tasty, but I felt there was too much of it. And so on.

So, if you’re thinking about trying it, go for it. You can cancel at any time with a week’s notice (ditto, if you’re going away on vacation just let them know a week in advance to skip the delivery while you’re away). I think it’s well worth it for a couple: it gets you to cook and try something different and sometimes that’s all the encouragement you need.

Now playing:
Johnson, Holly - Dancing With No Fear
(from Europa)

## The Auto Warranty sleaze

A month ago, we purchased my wife’s Acura off the lease. She’d done less than 30,000 miles in the three years she’d had it, there was nothing wrong with it, and there wasn’t anything available for the models she liked, in the colors and with the luxury level she was keen on. So rather than worry too much about that elusive new car, we just bought the current one off the lease. Maybe in a couple of years there’ll be something she likes and we’ll consider what to do then. Anyway, this is not about...

## Windows 10 upgrade: the Microsoft Money mess

OK, I get it: I’m behind the times. I still use Microsoft Money, the “sunset” edition. Yes, it’s been six years since it was retired, but I prefer it way, WAY more than Quicken. And, to be honest, thus far – I’ve now been using it for 20 years, believe it or not (first entry: July 3, 1995) – it’s been just fine. However, yesterday, I was suddenly brought up short with a jolt, or to be more accurate with an error message about Internet Explorer 6. It wants IE6??? So what was special about yesterday...

## WOFF files and Azure: the 404 conundrum

More than anything, this is going to be a discussion about testing, but the headline is all. This afternoon, in trying to keep cool inside on this hot day, I thought I’d remove the Google Ads on this site. Frankly they were a pain in the neck to design for: they used to be a side panel on the right, and trying to get the code to make them disappear when the browser window was too small width-wise was just annoying. Plus the ads were being loaded anyway even if they weren’t being displayed...

## My new homepage

There’s a TV program on the BBC at the moment that we’re watching called Celebrity Masterchef. In it a bunch of celebrities (90% of whom are completely unknown to me) tackle preparing and cooking dishes, competing against each other for the best one, well presented. There are a couple of jokey judges (whom I again have never heard of) to shake their heads in wonderment at the inability of the “celebrity in the street” to actually cook something appetizing. After watching a few of these, we’re getting...

## Open sourcing EZDSL on GitHub

Finally, and I mean FINALLY, I’ve uploaded my old Data Structures Library for Delphi (EZDSL) to GitHub. I’ve been meaning to do it for a while, and now it’s there and you can download it, issue pull requests, the whole nine yards. I also updated it for the Delphi XE series (the last time I ‘officially’ updated it was for Delphi 2009). It includes linked lists (single and double), queues, deques, priority queues, binary trees, binary sort trees, a mildly-broken red-black tree, hash tables, skip lists...

## CST-01: the world’s thinnest watch

OK, this is a weird one. Well over two years ago now, there was a Kickstarter for a watch. Not just any old watch, or even a smart watch, just the thinnest one in the world. It’s 0.8mm thin, or, for you non-metric types, 0.0315 inches. All it does is tell the time. So, I plonked down my $99 as one of the original backers and pretty much forgot about it, as I do with anything I back on Kickstarter. For those not in the know, Kickstarter is a site that helps people gather money (that is, backers...

# Extras

I'm Julian M Bucknall, an ex-pat Brit living in Colorado, an atheist, a microbrew enthusiast, a Volvo 1800S owner, a Pet Shop Boys fanboy, a slide rule and HP calculator collector, an amateur photographer, an Altoids muncher.

## DevExpress

I'm Chief Technology Officer at Developer Express, a software company that writes some great controls and tools for .NET and Delphi. I'm responsible for the technology oversight and vision of the company.
