jsburbidge: (Default)
In the summer of 1979 or 1980 - I'm not sure which, though diligent research could determine the question - I walked across the river from my parents' house at Trent's Champlain College to Wenjack Theatre to see the original production of Billy Bishop Goes To War.

On this past Saturday I went down to the Distillery District in Toronto, to a theatre not yet built when I saw the original production, to see Grey and Peterson's updated version of the same play. In between it has become an iconic Canadian play (far more so than its elder sibling[1] 1837: The Farmers' Revolt, also being revived this summer at Shaw) and the perspectives of both audience and actors have changed.

In 1978, when the play was first performed, Canada's most recent combat which had not been incidental to a UN peacekeeping mission had been in the early 1950s. Britain, though post-imperial, was nevertheless pre-Thatcher. Although the play was about war, what it spoke about to audiences was more strongly the colonial/imperial dialogue. (There were still plenty of people alive, like my grandparents (all living at the time) who had been grown adults when the Statute of Westminster was passed in 1931.) Now we're more likely to focus on the personal reactions, the skirting/escaping PTSD aspects of getting through that number of high-stress missions.

In parallel, the ageing of the actors has been accompanied by a deliberate transformation of the perspective of the play, a change from a capturing of the young, post-war soldier's autobiography (Winged Warfare, the source for much of the material, dates from 1918) to an old man's reminiscences (or perhaps those of his shade: Peterson is now older than Bishop was when he died). Transitions are more gradual, depictions a little more settled and less energetic; the risk is a sense, at the beginning, of memories being dredged up.

(Personally, I think the play's subversive centre of gravity comes towards the very end, in "The Empire Soirée", with the world-weary possibility that all of this was, in the end, irrelevant, a minor bump on the road in the continuing churn of national and imperial powers, the Untergang des Abendlandes; that the resignation of the English, willing to have their heroes dead, is a recognition of a zero-sum game populated by elegiac or tragic stories.)

All in all, it's a worthwhile performance.

[1]Grey and Peterson were at Theatre Passe Muraille when they came up with the idea for the play; Salutin's play was an earlier product of TPM.

150 years

Jun. 30th, 2017 11:09 am
jsburbidge: (Default)
 I am nicely old enough to remember the Centennial in 1967, although, as I was seven at the time, my memories of Expo '67, being taught the Canadian Railroad Trilogy and Gimby's "Canada" in school, and a general mood of Pearsonian optimism are not as precise as they might be.  The overall impression I get from people more of my parents' generation than mine is in agreement with mine: there was rather more enthusiasm for 1967 than for 2017.
 
Admittedly, the U.S. and U.K. are doing their best, these days, to make everyone else look good by comparison. Most Canadians surveyed consider Canada the best place to live, and holdouts would nevertheless consider it well up there in the rankings. (My impression is that immigrants are more enthusiastic; if all your grandparents were born here (like mine) you may tend to be a little more world-weary about it.)
 
Having disposed of Harper and the CPCs, we have a pleasant but not obviously strong prime minister whose government at least says all the right things, rather like Charles II ("We have a pretty witty king, Whose word no man relies on; He never said a foolish thing, Nor ever did a wise one"), albeit with less wit and more playing to the crowd. Not that all is well in the world of politics: we produced that harbinger of Donald Trump, Rob Ford, and there's lots of evidence that we share the discontent (and malcontents) that led to a Trump White House and Brexit.
 
For all the post-Charter changes in Canada, we remain firmly committed to what lawyers call the POGG clause ("peace, order, and good government"), and our leaders, after a daring swerve to a Nobel Laureate and a Jesuit-trained leader with intellectual heft, have reverted to being the spiritual heirs of William Lyon Mackenzie King, or of Bill "bland works" Davis, regardless of their ideological views. (Though I must give a nod to, of all people, Brian Mulroney as an agent of change, between a sensible but much-resented tax reform, a trade realignment leaping back over the years to Laurier's Reciprocity, and an attempted constitutional reform which actually might have been better for the country (if Preston Manning disliked it, it can't have been all bad)).
 
On the cultural front, perhaps the less said about the ingrown standard Can. Lit. crowd, the better. They have changed in alignment since the days of "The Canadian Authors Meet", no longer concerned with "zeal for God and King", but the core commitment to a common orthodoxy remains. There are, however, many happy exceptions, once one steps outside that charmed circle. Among the dead Davies stands out (who was fully aware that he got no respect at home until it was bestowed by critics in New York), along with Munro, Avison, and Scott (worth recalling as poet, constitutional lawyer and writer, and counsel in Roncarelli v. Duplessis), with honorable mention to Montgomery, Klein, and Buchan, tenuously Canadian but still read after a century. (A generation ago, someone I knew defined a Canadian SF writer as someone who had once flown over Canada: Dickson and Van Vogt were not far off. We are doing rather better these days.) 
 
We have produced some top-flight critics and philosophers: notably Frye (who stayed here), Kenner (who did not, but returned late in life to give a set of CBC lectures), Lonergan, and McLuhan, but there are worthy lesser lights (Gordon Teskey, Alexander Leggatt). We have hosted others: Lee Paterson, who taught at U. of T. and organized for the NDP, Jacques Maritain, Étienne Gilson, José Maria Valverde.
 
The past 50 years have seen CBC FM, now Radio 2, trashed by its management, and we have shared in the decline of classical studies in the West, but the overall education rate as measured by university participation has increased markedly (whether this is entirely a good thing is another question) and good musical ensembles have, one senses, expanded, although I'm not sure what the trade-off is between the decline of symphonies and the increase in early music ensembles. 
 
Choice in food has gone way up. I recall hunting vainly for shallots in the summer of 1982; they are now a staple in supermarkets. We have craft breweries and artisanal cheese makers and some decent wineries. It's not quite as good as living in France, but it's much better than it used to be.
 
I find it hard to muster up enthusiasm. We seem to be the country which sums up "damn with faint praise", though perhaps it's the inverse, the recognition that at least we don't have police helicopters bombing our courts or semi-autocrats put in place by a two-century out-of-date mechanism for putting a check on the public threatening to make the lives of the poor even more miserable while stirring up strife half a world away. 
 
My ancestors, generally, came here because it was potentially better than staying where they were - a Scottish crofter's son after the Highland Clearances, an impoverished London boy in an early variant of the Barnardo system, a nephew of an uncle who had lucked out as a Sergeant while reducing the fortifications of Louisbourg and eventually became an honorary colonel and a member of the Governor's Council, a family of Irish Methodists who had been burned out by the neighboring Presbyterians. There's a whole set of Nova Scotia Settlers who came up from the colonies for land over two centuries ago. (Their ancestors were mainly in the New World because they didn't like the C. of E. Colour me not impressed.) A few UELs, political refugees by today's standards. 
 
With the exception of a couple of German soldiers who had served with the British against the rebels and came in the Berczy settlement, they were all British subjects moving within the bounds of British jurisdiction. (Not the empire: most of them came over before the empire.) With the exception of one great-grandfather, my ancestors were all here, in Upper Canada or Nova Scotia, at the time of Confederation.
 
It doesn't make for a neat story, and there's probably little enthusiasm in it. Some came because things were better, but most because it was merely less bad. My Nova Scotian ancestors opposed Confederation. I have a relative who died at Vimy, and another who served in and lived through the Northwest Rebellion. I have a Minister of the Crown under Macdonald, a Minister of the Crown under Laurier, a Prime Minister and a Premier of Ontario (for the UFOs) to count as blood relatives, though the number of farmers in my Canadian ancestral tree outweighs everyone else.  If you squint hard enough, it matches a narrative of "land of opportunity", but on examination it looks more unusual than typical - both of my grandfathers had university degrees, and my grandmothers both had other forms of post-secondary education, in a period when this was rare in Canada.
 
As far as lived experience goes, I have lived abroad, and while I would definitely not choose the US over Canada, I'd willingly relocate to France given an equivalent job. I'd avoid the UK except for brief visits, despite the fact that, culturally, I'm mainly fairly classically English. 
 
So maybe a cheer and a half for Canada at its century and a half, and a toast in a local IPA or a claret from Thirty Bench vineyards, but a restrained one, in keeping with the old ethos of this Dominion. 
jsburbidge: (Default)
In the run-up to the Sesquicentennial celebrations in Canada, I am hearing a fair bit on the CBC that Canada is a "young country".

This is manifestly untrue. It is older than Germany and Italy; it is older than most Balkan nations (Greece is older), older than most African nations, older than any eastern European nation, with the exception of Russia, and older than many Asian nations. That's because countries' ages aren't based on how long the peoples living there have been there. (If it's used as hidden shorthand for the fact that European descended humans have lived here for only a few hundred years, then it's an act of Aboriginal erasure, in which I'm sure the CBC would not be complicit.)

Britain, France, Spain, Portugal, China, Russia (assuming continuity from the Empire through the USSR to today), the Netherlands, Belgium and Sweden are among the relatively small number of countries older than Canada. If we count "under their current constitution" then France, Russia, and China (at least) come off the list. 

ETA: Canada is younger than many states in the Americas - many if not most states in Central and South America are older, and the US is almost a hundred years older. But I suspect that that comparison is not what is on the minds of the people making the statement.

Capability

May. 30th, 2017 09:12 pm
jsburbidge: (Default)
A few days ago I was looking into some roughly seven or eight year old production C++ code, and saw, basically, the following (the only thing I have changed is the function name):

try {
    doSomething();
} catch (...) {
    throw;
}

It is enough to make one despair, because although I have seen worse code - this at least introduces no bugs - I have never seen anything that more clearly shows that the author had no idea what the hell he was doing. (The problem being that the effect is exactly the same as not putting in a try/catch block at all; it shows that the writer failed to understand in any real sense either the mechanisms or the rationale for the use of exceptions.)
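To make the no-op concrete, and to show what a catch-all that actually does something would look like, here is a minimal sketch; doSomething, the wrapper names, and the error text are all invented for illustration, not taken from the codebase:

```cpp
#include <cassert>
#include <stdexcept>
#include <string>

// Hypothetical stand-in for the production call.
void doSomething() { throw std::runtime_error("disk full"); }

// The pattern from the codebase: behaviourally identical to calling
// doSomething() directly, since the bare rethrow changes nothing.
void pointlessWrapper() {
    try {
        doSomething();
    } catch (...) {
        throw;  // same exception, same type, same propagation
    }
}

// A catch-all that earns its keep: attach context, then rethrow.
void wrapperWithContext() {
    try {
        doSomething();
    } catch (const std::exception& e) {
        throw std::runtime_error(std::string("while syncing: ") + e.what());
    }
}
```

Callers of pointlessWrapper see exactly the exception doSomething threw, as if the try/catch were not there; callers of wrapperWithContext at least learn where the failure happened.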

As a piece of cargo-cult programming it belongs with this, also from the same codebase:

~Foo() {
    try {
        delete m_bar;
    } catch (...) {
    }
}

Here, the author has been told that exceptions should never be thrown from inside a destructor, and has possibly been exposed to the idiom that surrounds generalised cleanup functions (like close()) with try/catch blocks in destructors to avoid it. He - it was a he, there were no women on the team that produced this - does not seem to have internalised the ability to make the judgement that when you are calling delete, if the class is yours you should be hewing to the above stated standard and if it's not you should drop it like a hot potato if it throws from a destructor. (The member class pointer was to another class in the same library.) There is no need to guard a call to delete with a try / catch block; if it throws an exception you are probably already screwed (corrupted heap, or something equally bad).
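The legitimate idiom the author half-remembered looks something like the following sketch; Connection and Session are invented classes for illustration. The point is that it is the cleanup call that can throw (close()) which gets the guard, not the delete:

```cpp
#include <stdexcept>

// A resource whose close() may throw (contrived via a flag for the sketch).
class Connection {
public:
    explicit Connection(bool failOnClose) : m_failOnClose(failOnClose) {}
    void close() {
        if (m_failOnClose) throw std::runtime_error("close failed");
    }
private:
    bool m_failOnClose;
};

class Session {
public:
    explicit Session(Connection* c) : m_conn(c) {}  // takes ownership
    ~Session() {
        try {
            m_conn->close();  // cleanup that can throw: guard it so the
                              // exception never escapes the destructor
        } catch (...) {
            // swallowed deliberately; real code would log here
        }
        delete m_conn;        // delete itself needs no guard
    }
private:
    Connection* m_conn;
};
```

Destroying a Session whose connection fails to close simply loses the error; nothing propagates out of the destructor, which is the whole point of the idiom.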

The same codebase betrays a failure to grasp polymorphism, const-correctness, the use of initializer lists, the Law of Demeter, design patterns, data hiding, package dependencies, global data use, and so forth. The code base dates from no later than 2008: it might be permissible in code from 1990.
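For two of those items, a minimal sketch of what was missing; the Employee class is invented for illustration:

```cpp
#include <string>

// Sketch: member initializer lists and const-correctness.
class Employee {
public:
    // Initializer list: members are constructed directly with their values,
    // in declaration order, rather than default-constructed and then
    // assigned inside the constructor body.
    Employee(std::string name, int id) : m_name(std::move(name)), m_id(id) {}

    // const member functions can be called on const objects and promise
    // not to mutate state; omitting const here poisons every const-correct
    // caller up the chain.
    const std::string& name() const { return m_name; }
    int id() const { return m_id; }

private:
    std::string m_name;
    int m_id;
};
```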

This is massively depressing.

I am no great defender of the modern rush towards credentialism and the move of the universities to become glorified trade schools for their students - my undergraduate university syllabus is now full of departments of humane letters trying to appeal to undergraduates based on usefulness in the world of employment - and my own background has scrupulously avoided any such thing (I write software with an M.A. in English and an LL.B., which I took out of an interest in law as such). But the one thing which having Computer Science courses is supposed to give us is properly trained and capable graduates. And from what I know organizationally these developers must have been at least CS grads, and very possibly Software Engineering grads.

These are not isolated observations. The majority of the "professional" code I see reflects sheer incapacity. The flaws it has are not those of carelessness, which can happen to anyone - nobody writes bug-free code, and even extensive testing and many eyes cannot be guaranteed to flush out all bugs - but of a sheer unwillingness to think about design or structure. Maintainability as an end is tossed to one side (even otherwise "good" developers seem to feel that there is nothing wrong in adding another 25 lines to a function which is already 500 lines long rather than breaking out, at a minimum, the new code and refactoring towards better modularization). Nobody seems to perform dependency analysis. In 25 years I have from time to time seen lip service paid to code reviews but never seen them actually implemented as general practice.

There are good developers out there; I've worked with some and read books by others. But whatever standards general training is supposed to impose are either too low or breached more often than observed.

This sort of complaint can be focussed just on the local area of software development, in which case it tends to become part of the standard debate about software engineering standards in academy and workplace, usually drawing distinctions between the generation of code and, say, building bridges, with the difference in length of existence of disciplines being adduced. However, I suspect that it is one aspect of a more general pattern, a kind of milder Sturgeon's Law. Certainly I have not known a surfeit of good managers, wide variations in teaching skill are prevalent in the schools, and competent politicians seem to be a vanishing breed (probably always vanishing, as the past is seen by the light of nostalgia: I'm not suggesting that things are getting worse, just that they've always been pretty dispiriting).

Some of this is institutionally driven: it really does make more pragmatic sense in many contexts to fill your required roles with what you can reasonably easily get rather than spending seemingly inordinate amounts of time hunting down significantly more able candidates, after a certain threshold has been passed, and to make the roles to be filled have responsibilities and expectations which can be met by the run-of-the-mill rather than to craft roles around the specific skills of the very good. Interchangeable parts apply to human roles as well as physical components. Likewise, society expects universities to turn out reasonably full classes of graduates rather than putting standards high enough that a significant number of students fail, drop out, or take six years to obtain a bachelor's degree. The intersection of the production rules for human capital and the onboarding rules which govern its corporate deployment can quickly lead to positions being filled with less than ideal employees.

That's not to say that it's right, either. I've seen a number of expensive projects which would have been much cheaper, and faster, if those in charge had been willing to pay to get them right the first time rather than be cheap the first time and have to pay more for a rebuild on the second - or third - try. (These have frequently been cases where an "essential" project has budgetary constraints, so a budget is deliberately confected which comes in just under the limits in time and money, and the rest of the time and money predictably become available once the project is further advanced, due to the sunk cost fallacy's domination over the minds of senior managers.)

Some of it is socially driven. Just as our Deweyan schools are built around social promotion for social at least as much as administrative reasons, so also much of our working world is socially built around the idea that most people are reasonably competent. There's a general attitude that encourages a culture of "good enough" when applied to domains which benefit from expertise.

But we, collectively, pay for it. Shoddy work - in infrastructure, education, management, politics - takes a toll on the lives of all of us, frequently indirectly, occasionally directly.
jsburbidge: (Default)
One of the causes bandied about for the current wave of political malaise is the loss of jobs which frequently gets blamed on immigration but on closer examination is generally more a result of the automation of manufacturing and other unskilled labour tasks. There is, however, some evidence that the drivers behind Trump's increase in the working class vote were more to do with identity and change in culture than with having become actually poorer. The two are not incompatible: even keeping up economically in a context where older types of labour are being eliminated on a wholesale basis is to be living in the centre of a storm of change.

We are already seeing political strain as an older (but not really very old) social and political model governing labour and the organization of one's life starts to give under the stresses of the post-modern, internetworked, world.

If I were a politician or a civil servant, this would be at the top of my list of concerns (along with climate change and how to handle the collapse of the petroleum bubble), because we're soon going to move into utterly unprecedented territory (barring some disaster which moves us somewhere unprecedented even faster.)

From an Atlantic article on a study estimating the near future effects of automation:

"Still, the authors estimate that almost all large American metropolitan areas may lose more than 55 percent of their current jobs because of automation in the next two decades."

(https://www.theatlantic.com/business/archive/2017/05/the-parts-of-america-most-susceptible-to-automation/525168/ ).

Even if that's off by 30% (which would mean only 37% of jobs to be lost), it's massive. Even taking into account that some of the lost jobs will be replaced by new ones (probably in low-level IT support), it's massive. Barricades-in-the-streets massive, if you're a politician.

These figures assume, by the way, that society as a whole will remain at least as well off. The job losses result from gains in productivity. (Other unrelated factors may cut into our surpluses - crop failures from climate change, major earthquakes (overdue), costs associated with mass migration - but let's just consider this on its own for a bit.)

The first big issue is the distribution of wealth, requiring a considerable increase in the welfare state and almost certainly a minimum guaranteed income. Along with this problem comes that of preventing profit-taking via rents. (If the general minimum income rises then, absent controls, the amount of available money to extract from tenants rises as well; this is like the one-time jump in house prices which benefitted existing owners when middle-class families switched from typically being based on one income to being based on two.) Purely practical considerations will require a far more economically interventionist government.

Secondly, a shift in how people define who they are becomes critical. A very few generations ago, everyone wanted to be leisured, and even the working classes aimed at as much leisure as possible (work for the minimum needed to live and take the rest of the week off, essentially). It took much indoctrination via advertising to create our society of workers who want so much they work as hard as they can. In a world where much of what many people can "do" will be on a volunteer basis, or very, very part-time, this sort of self-identification through work fails in many cases.

The assumption that everyone defines themself by what they do is much older than the Twentieth Century, but it was limited to the Second and Third Estates ("knight", "gentleman", and "baron" weren't what one does so much as pure statuses), which is why we have so many last names like Smith, Webster, and Farmer. (It's actually a bit more complex with those names, as they originated as nicknames to distinguish this Thomas from that Thomas, and the range of differentiators went well beyond what one did. For that matter, Thomas is a pure nickname itself, being the Aramaic for "twin". It's still true that identifiers like "Nick Bottom the Weaver" go back well before the modern period.)

If large swathes of the population become either jobless or involuntarily displaced into another job because the jobs they once had no longer exist as a category there will be even more reaction against, to put it generally, changing times. Deracinated voters may not move in the direction of Trumpism or the easy xenophobia represented by Brexit, but they are unlikely to continue to back a centrist, don't-rock-the-boat-too-much, "standard" political party, unless the polities within which they live are visibly taking some sort of realistic action to deal with the shifts.

One corollary of this is that we should worry not so much about Trump and May but what comes after, when their promises fail to come through for their followers and things continue to get worse (from that point of view).
jsburbidge: (Default)
From Madeline Ashby's Company Town: "RoFo was a sub-persona deployed by the Urban Tactics office to create an evolving portfolio of tasks based on residents' complaints. You just pinged RoFo, and complained about any damn thing you could think of. A crack in the wall. A clogged drain. ... It didn't mean the problem would get fixed right away, but it did mean you'd been listened to."
jsburbidge: (Default)
By coincidence of finding them in different places second-hand, I just finished Tolkien's Finn and Hengest back to back with Sisam's Studies in Old English Literature.

Together with some other texts (I also read Matthew Wright's The Lost Plays of Greek Tragedy recently as well), they set me thinking about the cultural erasure which occurred between the mid Eleventh Century and the Sixteenth, and more generally about loss associated with both classical and old Germanic cultures.

Most of this will be about history or fiction, but let me start with something smaller: a name. The odds are good that, unless you're an Anglo-Saxonist, you've never run into the name Wynfrith; although the man with that name is better known than, say, St. Wilfrid or St. Edmund Martyr, his principal cultus was on the Continent, and he is referenced in the histories as Boniface - a calqued Latin translation of his name - even in English.

Before 1066, of course, this was not the case. In letters translated from Latin into late Old English in about the Tenth Century, the Latin form (which would have been original in the Latin letters) is reverted to its native English form.

A similar but less drastic Latinization means that most readers of The Lord of The Rings, even those who have taken a history of the British Isles from Caesar onward, will not connect The Mark with the Anglian form underlying the Latin Mercia.

With a very few small exceptions such as Caedmon's hymn we have only one MS witness for any given poem in the surviving Old English poetic corpus. Worse, they are typically poor witnesses; they record texts the majority of which may have been written in a different dialect and certainly a couple of centuries - at least - before the MS date, in a period in which the form of writing itself was in flux, and the scribes - the latest ones, at least - were inexpert in poetic vocabulary and metre.

The poems themselves were written for an audience which understood a set of conventions, and a literary and historical background, to which we have only the barest access. They are full of hapax legomena words or entire expressions. Beowulf is almost composed of interpretative cruxes.

The matériel of the OE poetic corpus was once a coherent body on which poets could draw, confident in their audience's recognition. Its preservation had already come under pressure from a high culture which looked askance at old pagan or heroic stories (Quid Hinieldus cum Christo?); after Hastings the changeover in patrons meant its eclipse. The nobility were not interested in the old stories any more. (The next substantial poem in alliterative verse we have, the Brut, only a century later, follows the Matter of Britain and a Norman French original in Wace.)

Classical literature, honestly, isn't a lot better, though it has some advantages. Although some works survive with one witness - Catullus' Carmina survived in one MS when it was printed - most of the core works have multiple witnesses - but except in the very richest areas (Virgil, for example) the witnesses we have frequently go back to a common source at some remove from the original. (For any Classical Greek texts, pretty much by definition, our MSS reflect the edited texts of the Alexandrian scholiasts, which is good in one way - we inherit their scholarship and sometimes some of their editorial apparatus - but bad in another, as we have a hard time getting behind the homogenized texts to earlier versions[1].)

What we do have is a continuous tradition of interpretation for (some of) the major texts we have. (There were effectively no Anglo Saxon studies between the late 11th Century and the mid 16th Century[2], and it was only in the early 18th century that even a useful grammar was produced.) Still, even Virgil has numerous cruxes which have bothered interpreters since at least the late Classical period.

Most of Greek literature is irretrievable, barring a miracle from Herculaneum or the like. (The sole play by Menander of which we have the full text was unearthed from the sands of Egypt in the mid-20th Century.) The same is true of most of the Latin literature Virgil knew; we have only fragments of Ennius.

We grow up thinking we know Roman history, but even it looks more uncertain the more you know about the details. We have Cicero, Caesar, Sallust and Tacitus, but they all have agendas which make them biased witnesses. We lack chunks of Livy, because he ceased to be popular in the late Roman Empire - complete sets of his history were already rare in the Fifth Century - and people didn't bother copying him. And we have none of the variant histories which qualified Livy.

In a manuscript culture nobody has to suppress unwanted texts. Texts in which nobody is particularly interested don't get copied into the future. It's not that anyone was hostile to Agathon, or Asinius Pollio, or the vast quantity of Anglo-Saxon writers; they just had other things to do.

And the texts we do have came out of a different world.

Tolkien considered it likely that the Hengest of the Finn episode in Beowulf is the Hengest of Bede's history; other historians have considered Hengest and Horsa to be entirely mythical. Either way, the whole fabric of historical and allusory background that informed, say, Alfred, has dropped out from around the texts we have, leaving us to scrabble for the bits of what Tom Shippey has called "asterisk-history" to piece together what we can.

The same is true, although there is a difference in degree, for the classical world. We have Cicero's letters, but we have to reconstruct the laws of the late Republic under which he litigated his cases; Virgil (probably) knew Etruscan; even the records of awarded triumphs hide much that we cannot now know. Even for very well documented figures like St. Augustine, random discoveries (such as the Divjak letters) can change our understanding a great deal.

Of course, context drops out all the time, even for far more recent cases. Much of Elizabethan drama never made it to the printing-press; we have no text of Kyd's Hamlet, or Shakespeare's Cardenio. Arguments fly over exactly how the Tudor Reformation was greeted on the ground because for all that we have Roper and More and the Petition of Right and the Protestant controversialists there was a strict censorship with the Star Chamber behind it to prevent a general and accurate landscape of reaction from emerging into the records.

This is part of why I distrust and dislike translations.

It is not simply the interlingual gap, the "traduttore traditore", which is present for even contemporary texts. (Eliot's "Anabasis" cannot be Perse's "Anabase", no matter how hard it tries.) It is the way in which the translator's need to resolve cruxes hides them, the false veneer of certainty when a reader takes the translation as standing for the original. I make two exceptions: cribs (which (a) are meant to be read with the original and (b) frequently preserve cruxes with notes referring back to their complexity in the text) and translations which make no pretence to represent the original closely (some of Pound comes to mind) and have to be evaluated as works on their own with the original as a pattern at best.

[1]The same problem applies in spades to the Masoretic text of the Tanach, as the Masoretes had no interest in recording variants, and our main witness to alternative earlier forms is the Septuagint (plus some later Targums). This is one area where the Dead Sea Scrolls were greatly of use.

[2]We owe the rebirth of Old English studies partly to the fact that in the wake of the English Reformation some people thought that they could uncover a pure pre-RC English Christianity by reading Ælfric and Wulfstan. (They were wrong, but they ended up republishing some important prose sources and collecting and preserving MSS which turned out to be important as poetry sources.)

There's also a sudden blossoming of interest in pre-Conquest history in poetry and drama, whence Lear.

First post

Apr. 5th, 2017 08:43 pm
jsburbidge: (Default)
Another refugee from LiveJournal. I am currently leaving my LJ account in place and not transferring older posts.
