My challenge-within-a-challenge is back, with 12 fresh films to squeeze into my 2014 viewing.
The odd up-and-down aside, I feel WDYMYHS worked well last year; but for its second outing I wanted to make some changes. Though the top 12 that last year’s simple formula resulted in were all films I definitely needed to see — and several were ones I’d been looking forward to for so long I was actively put off by the level of expectation — I wanted to try something different. Last year’s 12 were, for want of a better word, a little “worthy”: 75% were black & white, 50% were from the 1950s, the most recent was 30 years old… I have nothing against any of those factors individually, but it began to feel rather dominant.
The question was, how to change it while also making the list a ‘random’ selection dictated by Best Of lists, others’ ratings, and the like? Well, it got complicated… but just in case anyone’s interested, I’ll explain it all anyway. Though for the sake of those who don’t care but are nonetheless curious what 12 films the system chucked out, I’ll do my explaining after the list itself. (That said, it’s only in the long explanation that you’ll learn what the string of letters and numbers under each title actually means.)
So, in the order they were generated (from ‘best’ to ‘not-as-best’), this year’s 12 are:
(All rankings were correct at the time of compiling and may have changed since.)
Now, the long bit:
As you can see, the new selection process has created a fundamentally different set of films. Last year, 50% came from the 1950s and there was nothing from the last 30 years; this year, 50% come from the last 20 years. Last year, 75% of the films were in black & white; this year, 83% are in colour. Last year, three of the films were over three hours long; this year, only two of them even cross the two-hour mark. Even the completely incidental matter of how many I have on Blu-ray and how many on DVD has been turned on its head, with last year’s 7:5 ratio becoming 5:7 this year. About the only thing that remains the same (not identical, but near enough) is the proportion of non-English language films: last year there were three, this year there are two.
Other similarities come in the presence of certain directors: there’s another film each from Chaplin, Hitchcock and Kubrick, all of whom (as you may remember) I had to reject multiple films by last year to meet my “no repetition” rule. In Hitch’s case, it’s the film I would’ve watched in 2013 were it not for my old “Blu-ray trumps DVD” rule; in Chaplin’s case, it was the film of his that ranked second last year; and for Kubrick, it was his third film last year but is now #1 under the new rules. No repeat appearance for Bergman, however, who had multiple entries at the top of last year’s long list, but this time only reached #18.
I’m not short of notable directors among the other nine, however, with a film each from: the Coen brothers, John Ford, David Lynch, Sidney Lumet, and what will be my first encounter with Darren Aronofsky. Depending on your point of view, the remainder don’t stint either: Mel Gibson, Jean-Pierre Jeunet, Park Chan-wook, and Pete Docter Of Pixar.
So, how exactly did I concoct this duodectet of acclaimed classics?
First, a quick reminder of the comparatively simple way I did it last year: I went through IMDb’s Top 250 and the top 250 entries in They Shoot Pictures, Don’t They?’s 1,000 Greatest Films and noted down every film I owned, then eliminated any that weren’t on both lists, then split the difference between their placement on each list to produce some kind of average. Then, allowing only one film per director and allowing films I owned on Blu-ray to earn a place above those I owned on DVD, the top 12 (ultimately culled from the top 18) became my final selections.
That’s far simpler than where we’re going this year.
So, as I said above, I wanted to make the list a little more (shall we say) populist. The best way to do this, I reasoned, was to include more lists. In the end I used five, and they were:
- IMDb’s Top 250, which guarantees a wide viewership and high ranking; it’s often seen as an incredibly mainstream list, but in places (especially a little lower down) it’s less so than you might expect;
- They Shoot Pictures, Don’t They?’s The 1,000 Greatest Films, which is compiled from an extraordinary number of ballots from critics, filmmakers, and more, weighted and analysed to produce a very academic list. To say it strives to be anti-mainstream is unfair, but it’s certainly not concerned with being populist;
- Empire’s 500 Greatest Movies of All Time, which is Empire magazine’s huge poll of readers, journalists and filmmakers from 2008. Much like the IMDb list, it skews mainstream, but even if it’s from “a mainstream film magazine” that’s still “a film magazine”, so the mid- to lower-levels produce interesting films;
- iCheckMovies’ Most Checked, which should see the inclusion of the kind of movies ‘everyone’ has seen but I haven’t;
- All-Time Worldwide Box Office, for essentially the same reason as above. (The version I used is linked to, though it seems to have numerous little differences to the one at my normal go-to site for box office numbers.)
For parity with the IMDb list, all were limited to the top 250 entries. For the record, all positions were collated from the iCheckMovies versions of the lists on 5th January.
As you can see, that’s a list of lists that errs much more toward the mainest of mainstreams than last year’s. However, I’ll repeat my caveats from above: the IMDb and Empire lists aren’t as unrelentingly populist as certain cinephiles would have you believe; and even where they are, I’ve already seen most of those films anyway. Additionally, with so many lists I removed the requirement for films to appear on all of them, which led to the following in my final 12:
- Two films don’t appear on the IMDb Top 250;
- Six films don’t appear in the TSPDT 1000’s top 250;
- Three films don’t appear in the Empire 500’s top 250;
- Six films don’t appear on iCheckMovies’ Most Checked;
- Eleven films don’t appear in the All-Time Worldwide Box Office top 250.
In all, 117 films I own appeared in the top 250 of at least one list, but only 48 of those appeared in the top 250 of two or more lists.
So how do all these lists come together to form my list? I can’t simply split the difference this time! The short answer is that I used a points system. For each list, a film received 251 points minus its position on the list; so the #1 film would get 250 points, the #2 film 249, and so on. If a film was outside the top 250, it scored 0 points for that list.
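For the programmatically minded, that base scoring boils down to a couple of lines. This is just an illustrative sketch of the rule as described, not my actual spreadsheet:

```python
def base_points(position, list_size=250):
    """251 minus chart position for a top-250 film; 0 for a miss.

    `position` is the film's place on one of the five lists,
    or None if it falls outside that list's top 250.
    """
    if position is None or position > list_size:
        return 0
    return (list_size + 1) - position

# The #1 film scores 250, #2 scores 249, and a film outside
# the top 250 scores nothing for that list.
print(base_points(1), base_points(2), base_points(None))  # 250 249 0
```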
This produced a chart that was interesting in a number of ways, but one was that it didn’t take account of how many lists a film was on. For instance, The Exorcist appears on four of the five lists, but is quite low on all of them, so its score was 188; The Passion of Joan of Arc, however, only appears on one list, but at #14, so its score was 237. That didn’t seem quite fair. To balance this, I awarded 50 points for every additional list a film was on beyond its first. So, to use the same two films, Joan of Arc got no bonus points, while The Exorcist got 150. These are two of the more extreme examples, but it certainly made huge changes — The Exorcist jumped up literally dozens of places.
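On its own, the multi-list bonus is just 50 points per appearance beyond the first; a quick sketch, using the two films above as the example:

```python
def multi_list_bonus(appearances):
    """50 bonus points for every list a film appears on beyond its first."""
    return 50 * max(appearances - 1, 0)

# The Passion of Joan of Arc appears on one list, The Exorcist on four:
print(multi_list_bonus(1))  # 0 -- no bonus
print(multi_list_bonus(4))  # 150 -- three extra appearances
```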
I felt some more tweaking was in order. It was all well and good rewarding appearances on multiple lists, but some films were in the upper echelons of one list but just scraping into another. I decided to weight the results further in favour of films that were at the top of particular lists. Essentially, this gives a slight edge to the importance of certain lists — which is fine, because I didn’t necessarily want all five lists to be of equal weight. So, 25 bonus points were awarded for being in: the IMDb top 100, the TSPDT top 50, and the iCM Most Checked top 50. (By this point I was just looking at numbers, so I’ve no idea what actual difference this made to rankings.)
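That top-of-list bonus might be sketched like so — the chart names here are just illustrative keys, not anything from the real spreadsheet:

```python
def top_tier_bonus(positions):
    """25 bonus points each for an IMDb top 100, TSPDT top 50,
    or iCM Most Checked top 50 placing.

    `positions` maps a chart name to the film's position on it,
    or None where the film misses that chart entirely.
    """
    cutoffs = {"imdb": 100, "tspdt": 50, "icm": 50}
    return sum(25 for chart, cutoff in cutoffs.items()
               if positions.get(chart) is not None
               and positions[chart] <= cutoff)

# A hypothetical film at #40 on IMDb and #10 on iCM, but only #60
# on TSPDT, collects two of the three possible bonuses:
print(top_tier_bonus({"imdb": 40, "tspdt": 60, "icm": 10}))  # 50
```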
I briefly considered awarding bonus points for an appearance on any list outside of its top 250 — IMDb and iCM Most Checked stop at that number, but the others go on much higher (the size is mostly in their names, but the box office chart goes to 500-and-something too). I was thinking of something like 25 or 50 points, until I realised this would mean a film could get more for being 251st on a list than it could for being 250th, or even 200th potentially. I could’ve raised all the films’ totals by the bonus amount (i.e. instead of scoring 250, #1 would score 300, and so on down), but, to be frank, I couldn’t be bothered.
One final points booster I did add, however, was again from iCheckMovies. That site has many, many official lists for films to appear on, and obviously the more lists it’s on the more acclaimed a film is. So, each film got the number of lists it was on as bonus points — e.g. The Shining appears on 21 lists, so it got 21 points; A Clockwork Orange appears on 29 lists, so it got 29 points — still not enough to reclaim last year’s spot above its Kubrick stablemate, though. In fact, I don’t think this had any impact on the final 12. Although the number of lists they’re on ranges from 14 to 29, at this point those kinds of points weren’t enough to see any of them booted out, or even rejigged within the 12 itself.
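Putting every rule together, the whole system amounts to something like the following self-contained sketch. Only the rules come from the explanation above; the positions in the example are entirely made up:

```python
def total_score(positions, icm_official_lists):
    """Combine every scoring rule described above:
    - 251 minus position for each top-250 placing (0 for a miss);
    - 50 bonus points per list beyond the first;
    - 25 bonus points each for IMDb top 100, TSPDT top 50,
      and iCM Most Checked top 50;
    - one point per official iCheckMovies list the film appears on.

    `positions` maps 'imdb', 'tspdt', 'empire', 'icm' and 'boxoffice'
    to a top-250 position, or None where the film misses that chart.
    """
    placings = {k: p for k, p in positions.items()
                if p is not None and p <= 250}
    base = sum(251 - p for p in placings.values())
    multi = 50 * max(len(placings) - 1, 0)
    cutoffs = {"imdb": 100, "tspdt": 50, "icm": 50}
    tier = sum(25 for k, c in cutoffs.items() if placings.get(k, 999) <= c)
    return base + multi + tier + icm_official_lists

# A made-up film on three of the five charts, on 20 official iCM lists:
positions = {"imdb": 80, "tspdt": 200, "empire": None,
             "icm": 30, "boxoffice": None}
print(total_score(positions, 20))  # 613
```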
With the final points awarded, all that remained was to institute my other rules. Firstly, no repeat directors — bye bye A Clockwork Orange, which actually finished second overall. I also decided to eliminate Raging Bull — it didn’t feel right it being on the list two years in a row. That had finished third. The next repetition isn’t until #16, a second Chaplin-directed film, but this year that fell beyond the reach of the final 12. I did make one more change, however: I eliminated #14, The Wild Bunch, which would otherwise have been the final film of the 12. Why? Well, this is one that could be contentious…
I say that as if anyone cares or my rules weren’t arbitrarily cooked up! But what I mean is, there isn’t any rule that counts it out. Yes, with this year’s selection I was aiming for a wide variety of tones, styles, eras, content and so on, and The Wild Bunch is a Western just like the film immediately before it (The Searchers) — but there are plenty of thrillers and a couple of comedies on the list, so why not repeat the Western too? Especially as I get the impression these two aren’t that similar. The real reason, though, is that I wanted to include #15, Blue Velvet. Were I to give the films a personal rating — of “have been waiting to see”-ness, say — the Lynch would come out on top of those two. As they were quite close in points anyway (414 vs 406), I decided to just make the swap, rather than continue to fiddle in the blatant hope of making Blue Velvet’s score rise.
And so, with my underhandedness factored in, I finally had my final 12.
That was fun, wasn’t it?
(The tall picture on the right is the final version of my long list. If you want, you can click here for a legible version, on which you can play “spot the French title spellcheck ‘corrected’”.)
The level of my wit is on full display with the inclusion of “Alfred Hitchcock’s Rear” in the top image. Teeheehee.