jsburbidge: (Default)
 In journalism, the passive voice, usually discouraged elsewhere on stylistic grounds, seems to be endemic in headlines.
 
The problem is that the impact of the headline becomes very different when the agent is omitted. The CBC has a headline: "Carney attacked for wanting 'free ride,' 'hiding' from public amid latest campaign break". It would leave a different impression if it said "Leaders of the CPC and Bloc attack Carney for wanting 'free ride,' 'hiding' from public amid latest campaign break", which is in fact what the article is about.

Election

Apr. 8th, 2025 07:02 pm
jsburbidge: (Default)
This may be the most remarkable election since 1968. Certainly it is by the current numbers. (I vaguely remember the 1968 Liberal convention coverage. I certainly remember Trudeaumania.)
 
 Two weeks and a bit into a campaign, before the debates, is early to be calling a result. But it may be worthwhile, cautiously, to point out certain things:
 
1) That Liberal majority in the polls does not (much) result from a Red Tory Carney picking up votes from the left edge of the CPC (non-ML). It seems to be what happens when a majority of normally NDP voters decide that blocking Poilievre at all costs is preferable to the alternative. (The fact that Jagmeet Singh is not necessarily popular with true progressives doesn't hurt either.) This means that there's little chance of the Conservative campaign changing many minds: the CPC is generally holding its core voters but cratering nevertheless. (By the same token, the ability to get large rallies out of CPC supporters will gain them nothing, other than perhaps revving up the canvassers they need (Jenni Byrne is supposed to be good at managing "the ground game"), as it doesn't expand their support. If anything, by being Trumpy in style, it might reduce their potential support.)
 
 It helps the Liberals that when Carney goes all Prime Ministerial, i.e. when he has to "break from campaigning" to deal with Trump, he sounds genuine, serious, and positive. Some commentators are throwing around words like "Churchillian", though that may be going a bit far.
 
2) The fact that Carney is visibly uncomfortable with campaigning may actually be to his advantage among people who are tired of "politicians" but just want decent government.
 
3) The split between the Ontario (and Maritime: let us not forget Peter MacKay and his legacy) and Western wings emerging into the daylight is in no way good for the CPC. It sort of makes the election start to look like the latter parts of the fight between King Arthur and the Black Knight. ("Only a flesh wound").
 
4) In theory, the Liberals could still slip up badly, especially in the debates. But given the underlying dynamics, it would take a really impressive disaster to make a lot of the people who have indicated they support the Liberals in this election stay at home and risk a win by Poilievre.
 
The debates are likely to be a stark contrast: on one side, an experienced attack dog whose key election lines are all negative[1], and on the other a very much not-a-politician whose core messages all fall into the two buckets of "positive" and "bracing". I suspect that viewers will largely take away what they came with.
 
[1]Aside from a lot of tax cuts. When faced with a crisis, what else can small-government conservatives do?
 
5) Finally, there's the loose cannon of Danielle Smith. She plays to her supporters; her local support is strengthened by being seen as anti-Ottawa and relatively pro-American. But in the key areas of Ontario and Quebec it just puts most people's backs up, including a fair number of PC voters. (There's a swathe of Doug Ford supporters who dislike Smith and rather like Carney, and who don't mind the idea of a Liberal PM with extensive financial and business experience. They might not vote for Carney, but if they don't they are liable to stay home.) And Poilievre will not, possibly cannot, condemn her unequivocally and strongly. Her behaviour may not shift many votes, but it is certainly likely to confirm anti-CPC voters in their views.
 
So one can be somewhat hopeful that at least, with the whole world going to rack and ruin, we may get our best shot to minimize the damage here at home.
jsburbidge: (Default)
Paul Krugman, today on Substack, talking about voters blaming governments for conditions (specifically inflation) out of any individual country's control: "The race is not to the swift, nor the battle to the strong, neither yet electoral victory to parties with good policies; but time and chance happeneth to them all."

Koheleth is always apposite in some way, especially in the Authorized Version, or maybe the Vulgate. (Vanitas vanitatum et omnia vanitas.)

This touches on my reactions to posts elsewhere about people who have never heard of the Odyssey. Although there is actually no obvious reason, given today's education system, that one should have run across Homer at any time during elementary or high school, it enriches one's experience to have read the Nekuia, or the recognition of Odysseus by means of an old scar (a passage chosen for discussion in Auerbach's Mimesis), or the destruction of the suitors, or indeed almost any other passage. (Plus it's a foundation for reading other texts.)

(My daughter, who has three years of Latin and also has a Greek Myths component in her English curriculum but clearly only a glancing familiarity with Homer, called me up a few weeks ago asking about the Odyssey. I told her to read it in a decent prose translation. She asked if she could borrow my copy, and I told her that it wouldn't do her much good, as it starts with ἄνδρα μοι ἔννεπε μοῦσα...)

jsburbidge: (Default)
Freeland is painful, Carney awkward, Gould sounded as though she could actually survive on the streets of a French city (she did her B.A. at McGill), and Baylis sounded as though he has (he had better have, as he was born in Montreal, even if he is an Anglo), although he was not displaying full facility with formal, eloquent standard French, being rather more colloquial. I think that we deserve an Anglophone leader who speaks French as well as Lucien Bouchard spoke English.
jsburbidge: (Default)
1) A month or two ago, I was looking for an adequate container for flour. If you buy 5 kg bags of flour and up (best value, but a little excessive unless you bake really quite a lot) your best bet is a plastic garbage pail, but below that level a container that will fit on a shelf is a reasonable hope.
 
Flour in Canada is sold in 2.5 kg bags.
 
The containers available, all made in the US, are for five-pound bags of flour. Those of us old enough to remember the Imperial system will recall that one kilo is 2.2 pounds, so a 2.5 kg bag is five and a half pounds. The flour containers will not hold the amount of flour one buys.
 
I was somewhat scathing to the person at the store. One can get containers, not meant for flour as such, which are larger still; but it is rather pointless to offer for sale containers not really fit for their advertised use.
 
2) On the other hand, they can't get the old system right either. I have a number of older English cookbooks, mainly Penguins, which predate the adoption of metric.
 
I find it nigh impossible to find proper liquid measures. A proper pint is 20 ounces (and a quart 40 ounces); all I can find are inferior US products which mislabel them as 16 and 32 ounces, respectively.

Ship Money

Feb. 2nd, 2025 10:12 am
jsburbidge: (Default)
 In the run-up to the Civil War, Charles I raised money through a levy to support the building and outfitting of ships. It required no parliamentary consent, and was challenged in court; Charles won.
 
The parliamentary party claimed that the levy violated a principle that the Crown required the consent of Parliament to levy taxes. If you read Coke, or the Whig writers who follow him, you will gather that this is correct, and that there were ancient liberties going back to very early days of which this was one. By this argument, the parliament of the Petition of Right and the Long Parliament were merely defending an ancient constitution which the Crown was assaulting.
 
With better scholarship and more disinterested scholars, it's now fairly clear that Coke and the parliamentarians were wrong all along the way. It would be more accurate to say that Parliament had begun to take the bit between its teeth and extend its powers under Elizabeth (partly driven by economic, partly by religious changes) but that the Tudors in general had the personal prestige needed to keep these trends in check. With the accession of the Stuarts to the throne, these tensions became public, and the slide toward what would become the Civil War began.
 
The question of the legality of Charles' levies is now a dead issue: for all of the pretences of continuity, England has had two revolutionary resets to the fundamental principles governing the relation of Crown and Parliament (1642-1660, 1688, finally settled for good in 1745) and appeals to anything before the Restoration are pure antiquarianism.
 
The founders of the United States were mainly Whigs - a very few figures with more traditionalist views were among them, but most such colonists were Tories/Loyalists in the Revolution. They took as gospel the principle that the legislature alone had the power to levy taxes, and wrote it into their constitutions.
 
There has been some erosion of this over the centuries (both in the US and elsewhere) by the growth of "secondary legislation" (i.e. regulations), where the legislature provides a framework but the executive can set details by direct regulation. Thus the legislature can, for example, establish a tax but allow the executive to set the rates. The same applies to measures which have a secondary effect of bringing money into the fisc (e.g. fines) but which are not primarily motivated by that goal.
 
The ability of the executive to set tariffs in an emergency is one such exception. It allows action to protect a national interest from economic threat without going through a lengthy process of legislation.
 
The tariffs levied by Trump are claimed to be allowed by this exception. However, given both the facts on the ground - it's hard to argue that any such emergency exists - and Trump's own statements elsewhere, it's clear that Trump wants tariffs because they raise money[1]. That is, he is performing an end run around the principle that revenues are to be raised only by the legislature, using the declared "emergency" as a fig leaf to cover the real reasons.
 
[1]In his view, from foreign countries; more realistically, from domestic importers and consumers.
 
This is actually, from an American perspective, a more serious issue than that of the economic dislocation caused by the tariffs. Like the attack on birthright citizenship, or the attempts to impound funding flows authorized by Congress, or the attempt to buy out federal workers en masse without authorization for the expenses or to remove Inspectors General with no notice or reasons being provided, this is an attempt to arrogate to the Executive Branch powers granted neither by the Constitution nor by explicit acts of Congress[2]. The US is now in the middle of a constitutional crisis of a scale not seen since the American Civil War.
 
[2]The tariffs are levied under an Act of Congress, but the claim that an emergency may be declared arbitrarily and with no evidence to trigger the condition goes well beyond the legislation. I expect that when this is challenged in court the Administration will claim that the determination is not subject to review by the courts.


 

Trade Wars

Feb. 1st, 2025 10:10 pm
jsburbidge: (Default)
 I was in an LCBO yesterday afternoon and was browsing beers when I saw an employee busy stocking shelves with an American beer - possibly Michelob or Old Milwaukee. My first thought was "that's not going to last long" but then I realized that almost all the beer from the big American brewers sold in Ontario is brewed and bottled in Ontario. The US beers that would be affected by Ford pulling US products from LCBO shelves would mainly be craft brews, or craft adjacent. (Samuel Adams and Sierra Nevada are imported, but Budweiser and Coors are Ontario made. (Goose Island is from Quebec.)) Conversely, Molson is Molson-Coors these days, and the headquarters is in the US, so Molson beers are in the same category. (Labatt's is technically Belgian, as part of AB InBev.) (Their product is crap as well, but for now I'm avoiding talking about quality.)
 
Which raises a question. There are plenty of "American" products produced in Canada by wholly-owned subsidiaries. Sometimes this dates back to pre-NAFTA times and has simply continued; in other cases it is just easier to manage fairly large volumes through regional production. (Heinz Ketchup took out advertisements a week or so ago to point out that their Canadian ketchup is made in Canada. So is Coca-Cola.)
 
If the aim of avoiding purchasing American in favour of Canadian products is based on immediate flows of money, the purchase of Coke or Heinz or, for that matter, Molson Canadian is sending much of the money to Canadian workers and Canadian suppliers to those companies[1], but there is still a flow of profits to the US parent. A real "buy Canadian" campaign aimed at pressuring American business interests means buying local, and typically from smaller producers. (And more expensive ones, typically: cheap cat food comes from American sources like Purina, and the Canadian brands like Acana/Orijen are among a higher-priced set of products.)
 
[1]Inputs are another matter. Craft beers, for example, are made with a wide variety of hops, some of which have a single source - so even though they are usually guaranteed to be fermented, bottled, and sold locally they usually have by definition an international aspect.
 
Sometimes you can't tell where something comes from. Many products made for Loblaws or other grocers' in-store brands merely say "made for" and give the grocer's name, but not the place of manufacture or who did the manufacturing. Blue Label Peanut Butter says "Processed in Canada", but that leaves open the possibility that the raw materials could come from anywhere. Their Water Crackers have no source at all - the label just gives Loblaws' address, not the manufacturer's.
 
This is not specific to own-brand labels: neither PC nor Classico sauces have a "Made in" statement. But it is a reasonable, though not certain, inference that a product from a Canadian manufacturer is likely to be sourced in Canada; no such inference can be drawn from a retailer's brand, since a retailer sells products from all over the world.
 
If you want to buy Canadian, your best bet aside from really diligent research is to buy from smaller specialist stores, more likely to be locally owned; to buy not only "Canadian" but local (sometimes from non-local chains: Whole Foods has a policy of sourcing from and highlighting local products[2]); and to be ready to pay more than a baseline amount for the products in question, except for agricultural goods in season, where local will tend to be cheaper.
 
[2]There's a small dilemma: Whole Foods is better along a whole set of axes (labour, ethical sourcing) than Loblaws or some of its other competitors[3], but it is emphatically US-based.
 
[3]The local Whole Foods is close to a Longo's: this is a local chain which started as an Italian immigrant grocery store and grew. It is partly owned by the founding family and partly by the chain which owns Sobeys; it's essentially a competitor in Loblaws' space, i.e. neither discount nor luxury. There are a number of brand-name products carried by both Longo's and Whole Foods. It is my experience that these are almost always cheaper at Whole Foods. They have a reputation of being expensive because they don't carry cheap food, but their markups do not seem to be exceptionally high.
 
The perspective changes if we shift to "boycott US" instead. Then we can look at goods from elsewhere - which is arguably what we should be doing: strengthening ties with non-US trading partners. For the next four years, we're all in this together.
 
-----
 
On the political front, it's clear that Trump wants tariffs simply because he likes tariffs, and that his pointing to the (small, apparently) traffic in fentanyl across the border is a veil over his dislike of the US trade deficit with Canada. The main check on Trump (other than the real but not certain possibility that his action will be found to be illegal - there are several reasons that this is arguably ultra vires) is that the markets, which had previously been treating the tariff threat as a negotiating tactic, will react badly enough that Trump pulls back. He does pay attention to the stock markets.
 
On our side of the border, the current headlines talking of a "possible trade war" understate it: my estimate is that a leader who did not retaliate in what was perceived to be a strong manner would be severely punished in public opinion, at the metaphorical level of being strung up on a lamppost - and there are elections coming up. Neither Ford nor Trudeau can be seen as neglecting to hit back.
 

Editions

Jan. 24th, 2025 07:58 pm
jsburbidge: (Default)
 (This is a bit of a ramble; there's no grand argument here, more like some free association.)
 
In 1980, I was introduced to Piers Plowman in the form of the Clarendon Mediaeval and Tudor Series edition of the first part (i.e. the part corresponding to the A-Text) of the B-text. It was essentially a light reworking by J.A.W. Bennett of Skeat's edition of the late 19th century, with notes added for students. This was in the context of a course focussed mainly on Chaucer, but I was interested enough in Langland to write a paper on Piers; I cannot recall what it said, except that it referenced the passage on the harrowing of hell.
 
A year later, I was introduced to some of[1] the fighting over modern textual criticism when I took a course in codicology in graduate school under Lee Patterson, in the form of the arguments over the Kane/Donaldson edition of the B-text. (Patterson wrote an article on the issues around the edition, "The Logic of Textual Criticism and the Way of Genius: The Kane-Donaldson Piers Plowman in Historical Perspective", reprinted in Patterson's Negotiating the Past, at about that time.) The editors (put very briefly) identified such a large degree of convergent variation in the editions of the A-Text and B-Text that classic stemmatics became impossible; editing had to be locus by locus. (There has not been universal agreement: Charlotte Brewer in particular was vocal in dismissing the approach. Given that her own analysis has led to the identification of an earlier Z-text which is even more dubious, I'm inclined to agree with Kane.)
 
[1]The other big argument at that time was over the Gabler Ulysses. That has never really settled down - there's a standard paperback aimed at the academic market using the Gabler text but there are also emphatic holdouts. The Folio edition of Ulysses uses the older text.
 
About a year after that I purchased a complete Piers Plowman: the EETS edition of the Skeat B-text (second-hand, at Thornton's, in Oxford). This was my reading copy for a good number of years. The Bennett edition was better as far as it went, but dropped about two-thirds of the poem.
 
The critical edition by the Athlone Press - that is, the George Kane versions, completed in 1999 by a C-text version by Kane and Russell - seemed to have remarkably little effect on what was broadly read. The copies of PP I ran across from time to time in second-hand stores were all based on Skeat - either the EETS version, or the short version by Bennett - and that seemed to reflect what students were reading. The Knott/Fowler edition of the A-Text showed up once (to be grabbed immediately) but the A-Text is really a version for somebody who has already become interested in the poem and wants to see the other versions.
 
(For some reason, the C-text shows up rarely, if at all, second-hand. The student versions were of the B-text until Pearsall's student edition of the C-text in 1978 - but I've never seen the Pearsall in the wild, so to speak, although it has been reissued twice, most recently in 2008, so it's clearly in use.)
 
A couple of years ago I found the parallel-text edition by Skeat at the Trinity College Book Sale - a career academic had retired and I was able to also acquire the EETS Gower and a few other Middle English texts as well. This gave me a second copy of the Skeat B-text plus a parallel A-Text and C-text, completing a collection of all three texts after a little over 40 years.
 
Then, about three months ago, I ran across the Kane edition of the A-Text in a local second-hand bookshop at a decent price. Like all of the Athlone Press versions, it has a long introduction (all about textual editorial principles and the evidence of the MSS) and full textual apparatus. It also happens to be a good reading text[2] with clear type, good page design, and overall very pleasing aesthetics. A check on AbeBooks also indicated that I could get a copy of the Kane/Donaldson B-Text reasonably cheaply, so I ordered it; it arrived late last year on the first day the post office was back in operation again. It has an even longer preface (which I had read over forty years before) and is an equally good reading copy.[3] The only problem with the Athlone versions is that they're not necessarily what you want to take on transit; they're hefty hardcovers with about half their pages given over to the prefaces.
 
[2]If you know Middle English. There's an extensive apparatus, but it's all textual variants. The Bennett and Pearsall editions would be what to use if you need more glosses and/or context.
 
[3]Some scholarly editions are run-of-the-mill books. Some show the signs of excellent design, and manage to fill both the demands of scholarly documentation and the reader's experience quite well. Another good example is the standard edition of Tristram Shandy, which is a lovely reading copy. (You can get the text of that edition as that in the newer Penguin Classics edition but, again, with a different kind of apparatus.)
 
All this raises the general question: in a case where there is a choice of editions of a text, which one should one choose, and why? (It's worse with Shakespeare. Just about every major edition of Hamlet differs from all the others; the degree of variance is less than with PP, but the number of choices is much greater.)
 
In some cases the key factor is simply expense and availability. To take a simple example of a popular novel: the definitive edition of Jane Austen remains that of R.W. Chapman, and the original books are lovely artifacts. It's one of the last scholarly books I know of printed using catchwords. (And you can get cloth bound volumes second hand for about 40 dollars sometimes.) It remains in print; a current new copy in paperback of one volume is about a hundred dollars. The original was also published in Morocco leather: the full set of five volumes runs about 5,500 (USD) second-hand. But the text itself, minus the apparatus and secondary material, is in the Oxford Illustrated Jane Austen, which is much cheaper (especially second hand). The Folio editions are about 80 USD per volume. If you just want to read the text, regardless of format, Project Gutenberg has free texts based on Victorian editions, and a Penguin Classics text is a cheap but reliable version (largely based on Chapman) and is available second-hand for about five dollars.
 
All of which resolves itself to essentially five choices:
 
1) Just want a reading copy, don't care about text, willing to read online: Gutenberg
2) Want a cheap copy to read in hardcopy: Penguin Classics or equivalent (10-20, depending on second hand or new).
3) Want a reliable copy with full editorial detail: from 40-90 dollars per volume, depending on Second Hand (HC) or new (Trade PB).
4) Want a very good hardcover reading copy: from 40 to about 100 (Second Hand Chapman/OUP to Folio).
5) Want the best version available on all counts: 1,100 per volume (Antiquarian leather-bound Chapman edition)
 
Most readers will fall into categories (1) and (2). (3) is pricey because there's a severe dropoff in demand numbers which affects the economics of publishing really scholarly editions in bulk. (4) is essentially a luxury market. In this case it happens to overlap with the scholarly market, but this market exists for non-scholarly books also. (5) is an extreme form of the luxury market: if you had money to throw away you could treat these as nice reading copies but they fall more into the "collector" space. (They're what Folio pretends to be.)
 
There are few variants in the text itself, though: the only authoritative source is the first edition, with a couple of corrections which may or may not be made following Chapman.
 
At the other extreme, consider Joyce's Ulysses. You can get reasonably cheap paperbacks of both the older text and the Gabler text, and there's an unresolved war over the superiority of one over the other. (Older texts, I should say, because there's more than one.) I have a second-hand copy of the paperback of the Gabler text aimed at students and the general reader which cost about 20 dollars or less. The old Bodley Head edition is a very nice reading copy, at 20 to 30 dollars second-hand. At a slightly higher end I was able to pick up the 1999 Folio edition (which is very emphatic about not being the Gabler text) for about 50 dollars. But all of those are simply the text itself. For the apparatus, the three-volume Gabler edition with full printing of variants is 750 for three volumes, second hand. That's not that unusual with large works for which the primary market is libraries: the Fraenkel Agamemnon is 500 dollars, second hand (also three volumes). (By comparison, you can get the very respectable Denniston and Page edition for about 30 dollars.)
 
In the end, this little associative tour may be more about markets than editions.
 
The collector's market, the one with $5,500 Jane Austen sets or the edition of the Allen Oxford Classical Texts Homer in calf leather and onionskin[4] isn't really a market in an economist's sense of the word. There's no mechanism for setting an agreed-on value. 
 
[4]This was a real thing. I saw it once in a library copy, and it's a lovely piece of work, and belongs to a vanished world.
 
The market for the Folio Society isn't really a collector's market, although I'm sure some people collect Folio editions the way some people collect Foulis Press editions, at a lower cost. It's a market for general readers with lots of money who see themselves as book fanciers. (They have shifted away from publishing editions of the classics to publishing the entire Dune series, Marvel comic collections, and Le Carré. I don't think their choices are poor from a marketing perspective, but it declares their market in a way that editions of Trollope, Austen, and Gibbon don't.)
 
The academic market, like the professional market which I knew from the other side when I was a legal editor, is one characterized by high costs - accuracy and reasonable usability are important - and small audiences, usually libraries and a few dedicated professionals. From a publisher's point of view, unless you are publishing a book which might be put on undergraduate courses, your market is little larger than the one an eighteenth-century publisher might have had: a set of libraries, plus a smaller number of individuals with the means and interest to purchase your product.
 
By comparison, the general market is vast. If only one in a thousand people are interested in buying a paperback copy of Clarissa[5], well, that makes some forty thousand potential customers in Canada, two hundred thousand in the United States, and maybe seventy thousand in the UK and Europe. Thus, modernizations of Piers Plowman are vastly cheaper than editions of the original text. This also has a bearing on why, although in general the theory of academic editions relies on copy-text for accidentals, editions of Shakespeare tend to have modernized language: it increases their market many-fold.
 
[5]I'm not. I read and enjoyed Pamela, but my life is likely to run out before I finish all the books which are in a notional queue before Clarissa.
 
Commercial authors in the serious midlist area can hope to do rather better[6]. Commercial bestsellers get to maybe one percent of the American public at best but the economies of scale are such that deep discounting still provides massive profits to the publisher and the author.
 
[6]Better than Piers Plowman modernizations. Doing better than Pride and Prejudice is a different category of challenge.
jsburbidge: (Default)
 ... which Trump couldn't reverse:

Pardon everybody Trump has mentioned going after for personal or partisan reasons. (His relatives, Harris, Cheney, etc.).

There's precedent for broad pardons for "anything done under the term of ..." (more monarchical than Presidential, but there's continuity there).

Most things a president can do by executive order can be reversed by another executive order. Pardons are not in that category.
jsburbidge: (Default)
Religion and Literature in Western England, 600-800 by Patrick Sims-Williams

Building Anglo-Saxon England by John Blair

Menewood by Nicola Griffith


One of these books is, obviously, not like the others.

The Sims-Williams book covers, in detail, what can be known about an area relatively close to what would later be the Welsh Marches (Hwicce and Magonsæte). Virtually everything known dates from a period after the initial establishment of the kingdoms: most documents were generated by the Church (and most are in Latin: as far as I know, we have no surviving documents in the dialect of the area).

The Blair book is a magisterial study of the built form of Anglo-Saxon England. It covers many things, but the takeaway for this discussion is that in general Anglo-Saxon material culture (wood, cloth, leather) was such as to leave relatively few archaeological traces, and settlements likewise left little behind. (The complex of dwellings associated with the East Anglian royal house is one of the things we have some evidence for - but even then it's basically the outlines of the foundations of the buildings.)

Menewood is a historical novel covering about two years in the life of Hilda (the Latinized form of her name) of Whitby, at about the age of twenty. Griffith hints at the end of the book what the next one will be about, with a view of the wider world.

However, "historical" is a slippery term here.

Once you get back to Hild's early days, a period of Christianisation, there is very little beyond the dates of battles and the deaths of kings, and none from contemporary documents. Bede has good coverage of what he is concerned with, but he is not a social historian, or even a general historian. (Bede would have known some people old enough to remember that period, much as I knew people, when I was young, who could remember Victoria's Jubilee. But he is a generation later.)

The battle at the climax of the book is an important event in Bede, where it is essentially a miracle validating King (later Saint) Oswald; it's essentially unrecognizable in the novel, in part because Griffith is being deliberately revisionist, but in part because the level of action the book covers is simply not recorded in anything remotely close to the period at all.

Let me be blunt: other than a few names and the dates of a few battles, we know almost nothing about the matter in Menewood. We know nothing about relative degrees of Christianisation; we know nothing about what Anglian paganism actually looked like; we have no clear idea of what the range and flexibility of gender roles was. We're even guessing about what people wore. We know about the names of kings and important churchmen and the broad sweep of their lives, with the odd illumination of little vignettes like Caedmon's vision.

Griffith's novel is technically plausible. There is nothing we know which prevents it from having happened. But it's wildly unlikely. It's unlikely on a level which makes Francis Crawford of Lymond look like a model of historical accuracy; at least everyone and everything he deals with is solidly grounded. (And nobody is making a pretence that Lymond is real; just almost everybody he deals with.)

It's a very good novel, but the term "historical fiction" is being stretched to the breaking point. It's adjacent to (but never slips into) Alternate History as a branch of speculative fiction, as it preserves the space for the history we know to follow.

Books set in blank areas don't have to be quite like that. Sutcliff's Sword at Sunset is about the even more poorly-attested Arthur, but it generally tries to keep to the way of the reasonably likely. (Stewart's Merlin books cross the boundary into spec fic by presenting Merlin's power as real.)

Griffith does know the background well. She's not slipshod or misleading about anything we can know. Her depiction of the (deeply problematic) ethos of the comitatus (about which we know a good deal, generally) is spot on, and her translation of Cadwallon's historical record into concrete terms is well thought-out. But the closer we get to Hild herself, the closer we get to a bubble of just-plausible improbability.

It's well worth reading, but take the idea that it's a guide to history of any sort with several large pinches of salt.
jsburbidge: (Default)
I

 Bientôt nous plongerons dans les froides ténèbres ;
Adieu, vive clarté de nos étés trop courts!
J’entends déjà tomber avec des chocs funèbres
Le bois retentissant sur le pavé des cours.

Tout l’hiver va rentrer dans mon être: colère,
Haine, frissons, horreur, labeur dur et forcé,
Et, comme le soleil dans son enfer polaire,
Mon coeur ne sera plus qu’un bloc rouge et glacé.

J’écoute en frémissant chaque bûche qui tombe;
L’échafaud qu’on bâtit n’a pas d’écho plus sourd.
Mon esprit est pareil à la tour qui succombe
Sous les coups du bélier infatigable et lourd.

Il me semble, bercé par ce choc monotone,
Qu’on cloue en grande hâte un cercueil quelque part.
Pour qui? – C’était hier l’été; voici l’automne!
Ce bruit mystérieux sonne comme un départ.

II

J’aime de vos longs yeux la lumière verdâtre,
Douce beauté, mais tout aujourd’hui m’est amer,
Et rien, ni votre amour, ni le boudoir, ni l’âtre,
Ne me vaut le soleil rayonnant sur la mer.

Et pourtant aimez-moi, tendre coeur! soyez mère,
Même pour un ingrat, même pour un méchant;
Amante ou soeur, soyez la douceur éphémère
D’un glorieux automne ou d’un soleil couchant.

Courte tâche! La tombe attend ; elle est avide!
Ah! laissez-moi, mon front posé sur vos genoux,
Goûter, en regrettant l’été blanc et torride,
De l’arrière-saison le rayon jaune et doux!

Charles Baudelaire, Les fleurs du mal
jsburbidge: (Default)

I have been generally inactive here (save for some comments) for the past while, as most of my online writing has gone into a much more technical blog of little general interest except to others involved in software development.

I expect to be more active here in the near future; the set of projects I was working on (and writing about) is largely complete.

jsburbidge: (Default)
 A few months ago, I ran across a reference by Jo Walton in one of her Tor reading lists to a work by Victoria Goddard, whom I had never encountered before. A quick check around the net revealed a large number of very positive reviews of Goddard's work, so I decided to check her work out.

(Note that as Goddard is self-published, her work is quite reasonable in price if bought as e-books but fairly pricey if bought as hardcopy. E-books are available directly from her website or via various other sites (though not from Google Books).)

Goddard is good, and worth recommending, although she is not quite as good as many of her more enthusiastic reviewers would make her out to be. The discussion below is (of necessity) rather full of spoilers.

Spoilers below... )
jsburbidge: (Default)
 There's an old trick in searching a C-style string that you actually own (i.e. you can't do this in an implementation of strchr(), which requires a const argument): you can use a sentinel to speed up searching.
 
If you have a 300-character C-style string which is not in read-only memory and is safe to modify, and you want to find the first instance of a character in it, you can use std::strchr(), but at every point in the search that function has to check two conditions: first, is this the character you are searching for, and, second, is it the null end-of-string character? Or you can very temporarily assign the value of the character you are looking for to the location of that terminal null and use memchr() instead, changing it back when the call is over. If you get back the address of the artificial sentinel, there were no instances in the string.
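A minimal sketch of the trick, assuming you own a writable buffer and already track its length (the function name and signature are mine, for illustration):

#include <cstring>

// Sentinel search: plant the target character over the terminating null so
// the scan never has to test for end-of-string separately.
char* find_with_sentinel(char* s, std::size_t len, char c)
{
    char saved = s[len];        // the terminating '\0'
    s[len] = c;                 // plant the sentinel
    // memchr is guaranteed to find c within len + 1 bytes.
    char* hit = static_cast<char*>(std::memchr(s, c, len + 1));
    s[len] = saved;             // restore the terminator
    return hit == s + len ? nullptr : hit;   // hitting the sentinel means "not found"
}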
 
Note that in this context this is a pure, old-fashioned optimization, of the sort that you don't do unless it's genuinely useful, as it complicates the code and makes it more brittle. That being said, it's a very long-established trick which shouldn't be particularly confusing.
 
However, there are other domains where using a sentinel can be a win from the design and maintenance as well as the efficiency perspective.
 
I had a set of conditions, settable at run time by command-line options, which set filters on a set of records. These were implicitly anded together - if a record didn't match a criterion, it was out. (There was special handling to allow two or more conditions of certain types to be ored together.)
 
Writing this as a hardcoded set of if/else choices would have been horrendous. So I implemented them as a family of function objects stored in a vector. The conditions could be set up initially in a fairly straightforward manner. If you put the most common reasons for filtering out up front, you could reduce the number of comparisons. The conditions could then be checked with a simple std::any_of call.
 
However, if there was filtering out, there was still some stuff to do; this wasn't a simple case where you have to do stuff only when an item was found. So it looked like
 
if (std::any_of(a.begin(), a.end(), [&](const auto& arg) { return /* ... condition ... */; }))
{
// Do some stuff
// Return X
}
else
{
// Do some other stuff
// Return Y
}
 
This is ugly. Its maintainability isn't awful, but it's not great, either. And every run has an extra branch after the end of the STL algorithm.
 
(Branching can mess with pipelining and slow down performance. This is in addition to being, frequently, maintenance problem points. In many cases both clarity and efficiency argue in favour of replacing branching by polymorphism.)
 
But I had created these function objects. I could do anything I wanted with their interfaces. (If I hadn't, I could have used a wrapper.) So I added an additional execAfterFind() function to the function object, and all of the real criteria for exclusion had a (common, inherited) implementation corresponding to the if part of the test. I then created a new type which *always* matched and placed it at the end of the vector of tests in every case. It, and it alone, had an implementation of the new function corresponding to the else branch.
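A minimal sketch of that shape (all of the names here - Record, ICriterion, FieldCriterion, execAfterFind() - are illustrative, not the original code):

#include <memory>
#include <vector>

struct Record { int value = 0; };

class ICriterion
{
public:
    virtual ~ICriterion() = default;
    // Returns true if this criterion filters the record out.
    virtual bool operator()(const Record& r) const = 0;
    // Invoked on whichever object the search stops at.
    virtual void execAfterFind() const = 0;
};

// A real exclusion criterion: shares the common "record excluded" behaviour
// corresponding to the old if branch.
class FieldCriterion : public ICriterion
{
public:
    bool operator()(const Record& r) const override { return r.value < 0; }
    void execAfterFind() const override { /* do some stuff, return X */ }
};

// The sentinel: always matches, and it alone implements the old else branch.
class AlwaysMatches : public ICriterion
{
public:
    bool operator()(const Record&) const override { return true; }
    void execAfterFind() const override { /* do some other stuff, return Y */ }
};

// When the vector of tests is built, the sentinel is pushed on last, so the
// search below always stops on a valid element.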
 
Now the call looked roughly like this:
 
auto foo = std::find_if(a.begin(), a.end(), [&](const auto& arg) { return /* ... condition ... */; });
foo->execAfterFind();
 
This is cleaner overall, not only at this site. What about performance?
 
For the case where the record ends up not being filtered out, there's probably no gain: unless a really good optimizing compiler optimizes the test away on the "always matches" object through a polymorphic call (unlikely), the new object just moves an if/else test around. There might be a small cache benefit because all the object tests with their functions were allocated one after another and we just *might* have improved some cache locality, but I wouldn't count on that, either.
 
However, most records are expected to be filtered out. Consider a record that gets booted by the first, most likely, test. In the old implementation there were two successive branches, one for the test, one for the branch after the STL algorithm has run its course. Now there is only the one branch. So we probably gained overall; we're certainly not likely to have made anything worse. So we have an improvement in readability / maintainability and efficiency, both at once.
 
If, by the way, your concern is strictly time optimization, and you have the space, and a known, small enough set of criteria, this is not the way to go about it. For that you give every condition its own return value as a different power of 2 and use std::accumulate rather than anything with a test. After running std::accumulate you can use if/else if all you care about is whether anything matched at all; otherwise, use an array of 256 (or 128, or whatever best suits your use case; table sizes corresponding to larger sets are probably not ideal unless you really need the speed over the space[1]) function objects addressed from a known point and just invoke the one selected by the returned value as the array index. I do not recommend jump tables of this sort as an approach to maintainability, though: they are tremendously fragile in the face of code changes. They can, however, be extremely fast.
 
Even a simple array of two functions can be used if the accumulated value can only be 0 or 1:
 
std::array<std::unique_ptr<IFoo>, 2> funcs;
// Set up array
...
int val = std::accumulate(... parameters including initial value of 0 ...);
funcs[val]->execAfterFind();
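A slightly fuller sketch of the power-of-two variant (the three criteria, the Record fields, and all the names are mine, purely for illustration): each test contributes its own bit, std::accumulate combines them, and the result indexes a table of 2^N handlers.

#include <array>
#include <functional>
#include <numeric>
#include <vector>

struct Record { int a = 0, b = 0, c = 0; };

using Criterion = std::function<unsigned(const Record&)>;

// Each criterion returns its own power of two, or 0.
const std::vector<Criterion> criteria = {
    [](const Record& r) { return r.a > 0 ? 1u : 0u; },
    [](const Record& r) { return r.b > 0 ? 2u : 0u; },
    [](const Record& r) { return r.c > 0 ? 4u : 0u; },
};

void dispatch(const Record& rec, const std::array<std::function<void()>, 8>& handlers)
{
    // Every criterion is evaluated; the disjoint bits are combined into one index.
    unsigned idx = std::accumulate(criteria.begin(), criteria.end(), 0u,
        [&](unsigned acc, const Criterion& crit) { return acc | crit(rec); });
    handlers[idx]();   // a single indexed call instead of a chain of branches
}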
 
The drawback is that you always visit every test; any_of and find_if truncate your search. You'd have to give very, very careful thought to whether this would actually be a benefit or a cost, and you probably want to do careful profiling over a range of cases. (In the case I had, the majority of records would be screened out quickly; this would not have been an appropriate solution. If most had been retained, that would be another question.) The other drawback is that the table setup is rather more complex and uglier than the preparation for the sentinel approach.
 
[1] If you have more than 8 tests then the gains from not branching are going to be counterbalanced by the need to process all of the tests rather than short-circuiting as find_if does.
jsburbidge: (Default)
 The shepherds sing; and shall I silent be?
My God, no hymn for thee?
My soul’s a shepherd too; a flock it feeds
Of thoughts, and words, and deeds.

The pasture is thy word: the streams, thy grace
Enriching all the place.
Shepherd and flock shall sing, and all my powers
Out-sing the day-light houres.

Then we will chide the sunne for letting night
Take up his place and right:
We sing one common Lord; wherefore he should
Himself the candle hold.

I will go searching, till I finde a sunne
Shall stay, till we have done;
A willing shiner, that shall shine as gladly,
As frost-nipt sunnes look sadly.

Then we will sing, shine all our own day,
And one another pay:
His beams shall cheer my breast, and both so twine,
Till ev’n his beams sing, and my musick shine.

-- George Herbert

Filtering

Dec. 21st, 2022 09:55 am
jsburbidge: (Default)
 If you (for C++-developer values of "you") happen to be in the happy possession of a C++20 compiler, one facility it provides is the ranges filter view (std::views::filter), which allows iterating over a range while filtering out certain elements.
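For concreteness, a minimal sketch of that facility (the example values are mine):

#include <iostream>
#include <ranges>
#include <vector>

int main()
{
    std::vector<int> seq{1, 2, 3, 4, 5, 6};
    // The view visits only the elements for which the predicate is true.
    for (int v : seq | std::views::filter([](int n) { return n % 2 == 0; }))
        std::cout << v << '\n';   // prints 2, 4, 6
}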

If you don't have one, there are several options.

At a simple level, for use with for_each(), there's simple composition. If you have a filter predicate Pred for Foo:

class Pred
{
public:
bool operator()(const Foo& inVal) const;
};

And a functor that does something with them, Op:

class Op
{
public:
void operator()(const Foo& inVal);
};

you can always create a composite:

class FilteredOp
{
public:

FilteredOp(Op& inOp, const Pred& inPred) : m_op(inOp), m_pred(inPred) {}

void operator()(const Foo& inVal)
{
if (m_pred(inVal))
m_op(inVal);
}
private:
Op& m_op;
Pred m_pred; // stored by value so that passing a temporary Pred() is safe
};

(We will refer to this as the naive version. This could easily be turned into a template to do more general composition.)

This works just fine - if all you want to invoke is for_each(). But if you want to use, e.g. transform() or rotate_copy(), it won't work. (Some operations provide a filtered option with their _if variants. Many do not. Many of those operate in such a way that a valid return value is expected for every application. In other cases, e.g. sample(), there is no predicate functor to be extended in this way.)

It is also very slightly more elaborate to write

Op op;
FilteredOp fop(op, Pred());

std::for_each(seq.begin(), seq.end(), fop);

than, say,

Op op;

filtered_for_each(seq.begin(), seq.end(), op, Pred());

even if you legitimately want to use for_each(), but only very slightly.

(The same applies a fortiori if FilteredOp is a closure; the difference lies in how closely you have to look at what is happening to discern intent; a closure has no name to assist the maintainer.)

The next alternative, if you have C++11, is to create a temporary filtered copy using copy_if:

typedef std::vector<Foo> Seq;

Op op;
{
Seq tempSeq;
std::copy_if(seq.begin(), seq.end(), std::back_inserter(tempSeq), Pred());

std::for_each(tempSeq.begin(), tempSeq.end(), op);
}

This is not a great improvement on the naive version, and costs more, though it does avoid multiplying entities. The big downside is that if you are processing the filtered data only once, you pay the copying costs in both time and memory.

It has the advantage of being idiomatic. And, of course, it works for a use of std::sample().

It is a better choice if you want to process the filtered data in any way more than once - by far the best choice, as the costs of subsequent iterations will be cut by the initial filtering, unless you have memory constraints. (Also note that std::partition_copy, available since C++11, will separate the matching and non-matching elements in a single pass, leaving two sequences ready for subsequent operations.)

One other advantage of the copy_if approach is that you can change the nature of the collection - you can, for example, iterate over a vector and insert into a set, effectively carrying out a sort on your filtered items at the same time. This may not be as efficient as copying to a vector and then applying a sorting algorithm - but again, a second stage in processing of this type is the sort of thing the copy_if approach enables.
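For example (a fragment in the style of the snippets above, reusing seq and Pred, and assuming Foo has an operator< so it can live in a std::set):

#include <algorithm>
#include <iterator>
#include <set>

std::set<Foo> sorted;
std::copy_if(seq.begin(), seq.end(), std::inserter(sorted, sorted.end()), Pred());
// sorted now holds the filtered elements in Foo's sort order.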

The other alternative is to turn to boost. Boost has a filter iterator, which visits only the elements satisfying Pred, skipping the rest, without modifying the underlying sequence. Thus:

Op op;

std::for_each(boost::filter_iterator<Pred, Seq::iterator>(seq.begin(), seq.end()), boost::filter_iterator<Pred, Seq::iterator>(seq.end(), seq.end()), op);

This works generally.

If you need to filter once only, then this is preferable. (It can also be used to emulate copy_if if you have a C++03 compiler but also have boost, by using it with std::copy.) If you need to operate on the filtered set more than once, it is suboptimal, since every iteration has to visit every element in the full sequence each time - unless you are optimizing memory (large sequences) and care less about time; this is the option using the least memory.

You can compose filters if you need to. This gets confusing unless you use typedefs.

Whether it's idiomatic or not depends on how much you consider boost fundamental. The naming does declare exactly what you are doing, though.

The original use case that got me thinking about this was one with a switch driven by context. If a flag was set, we iterate over just the filtered subset; if not, we iterate over everything. In a for_each example, the naive implementation looks like:

Op op;

if (flag)
{
FilteredOp fop(op, Pred());

std::for_each(seq.begin(), seq.end(), fop);
}
else
std::for_each(seq.begin(), seq.end(), op);

The copy_if example looks like:

Op op;

if (flag)
{
Seq tempSeq;
std::copy_if(seq.begin(), seq.end(), std::back_inserter(tempSeq), Pred());

std::for_each(tempSeq.begin(), tempSeq.end(), op);
}
else
std::for_each(seq.begin(), seq.end(), op);

This can be simplified by moving the filtering into a small generator class:

class Filter
{
public:
const Seq& getFilteredRange(const Seq& inSeq, bool inFlag)
{
if (inFlag)
{
std::copy_if(inSeq.begin(), inSeq.end(), std::back_inserter(m_temp), Pred());
return m_temp;
}
return inSeq;
}

private:
Seq m_temp;
};

Filter f;
Op op;
const Seq& toProcess = f.getFilteredRange(seq, flag);

std::for_each(toProcess.begin(), toProcess.end(), op);

Effectively the generator replaces the FilteredOp class, so it's a tradeoff in complexity but clearer at the call site.

The filtering class can avoid the internal if/else if it is implemented as a strategy. This is useful if it will be used multiple times, always with the same value of flag (e.g. passed at the command line).
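A sketch of that strategy variant, reusing the Seq and Pred types from above (the interface and class names are mine): the flag is consulted once, when the strategy is chosen, rather than on every call.

#include <algorithm>
#include <iterator>
#include <memory>

class IRangeSource
{
public:
    virtual ~IRangeSource() = default;
    virtual const Seq& get(const Seq& inSeq) = 0;
};

class PassThrough : public IRangeSource
{
public:
    const Seq& get(const Seq& inSeq) override { return inSeq; }
};

class Filtered : public IRangeSource
{
public:
    const Seq& get(const Seq& inSeq) override
    {
        m_temp.clear();
        std::copy_if(inSeq.begin(), inSeq.end(), std::back_inserter(m_temp), Pred());
        return m_temp;
    }
private:
    Seq m_temp;
};

// The branch on the flag happens once, at setup:
std::unique_ptr<IRangeSource> source;
if (flag)
    source.reset(new Filtered);
else
    source.reset(new PassThrough);

const Seq& toProcess = source->get(seq);
std::for_each(toProcess.begin(), toProcess.end(), op);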

The boost example looks like:

Op op;

if (flag)
std::for_each(boost::filter_iterator<Pred, Seq::iterator>(seq.begin(), seq.end()), boost::filter_iterator<Pred, Seq::iterator>(seq.end(), seq.end()), op);
else
std::for_each(seq.begin(), seq.end(), op);

What about that notional filtered_for_each I threw in at the beginning?

Well, it can just be implemented by wrapping the boost version in a template function call. I'm not sure that the syntactic cleanup is better than a typedef, though. Once you have

typedef boost::filter_iterator<Pred, Seq::iterator> PredFilteredIterator;

instead of

template<typename T, typename U, typename V> void filtered_for_each(V begin, V end, T& inOp, const U& inPred)
{

std::for_each(boost::filter_iterator<U, V>(inPred, begin, end),
boost::filter_iterator<U, V>(inPred, end, end), inOp);
}

(declaration more complex than that, and needing more policies, but you get the picture...)

then

filtered_for_each(seq.begin(), seq.end(), op, Pred());

versus

std::for_each(PredFilteredIterator(seq.begin(), seq.end()), PredFilteredIterator(seq.end(), seq.end()), op);

isn't a big improvement in clarity, and involves a lot more finicky work getting the function definition both correct and general.

It's generally a good idea to avoid for_each() as less expressive (and often less efficient) than the more specific algorithms in the STL. And anything that is complex enough that it doesn't fit any more specific algorithms is frequently complex enough that adding the filtering logic internally on a custom basis may make sense. (An example would be something aimed at generating elements in a collection based on another collection, but with a variable number based on the characteristics of the input parameter. This will not work with std::transform or std::generate_n. If you already have selection logic, integrating filtering logic may very well be more efficient on a custom basis inside your functor than doing so via any form, direct or indirect, of composition. Likewise, if you are processing a set of inputs and converting them into database insertions but skipping some, the field access you are doing to build the database insertions can double for filtering.)

In general, too, using a more precisely targeted algorithm instead of for_each() tends to move complexity out of the functor you have to write. In some cases it can move a lot of complexity into library code. (Using remove_if() plus erase() is much, much simpler than implementing the behaviour in a general for loop of any sort.) But even using std::transform plus an inserter to fill a container means that you have separated out the code dealing with the target container from the code doing the element generation, even though the container logic remains in the application space.
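To spell out the remove_if()/erase() point, the whole of that idiom, for a container like the Seq above, is a single line:

seq.erase(std::remove_if(seq.begin(), seq.end(), Pred()), seq.end());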

For all these reasons putting effort into writing an extended for_each is probably always using energy and attention which can be better expended elsewhere.

Matt Wilson's Extended STL has a chapter on the implementation of filtering iterators. It may be worth emphasizing one thing that he notes: a filtered iterator cannot have the semantics of a random access iterator. Not only will indexing be an issue, but so will the concept of distance between two iterators; both can in theory be supported, but only at considerable expense, and may give unexpected values. (If we apply a filter F to a sequence of length 10, the effective length of the filtered sequence can't be determined without traversing the entire sequence, and an indexing operation might return nothing but the end of the sequence (at one extreme), depending on how many elements were filtered out.) If you need random access semantics, using copy_if to generate a standard sequence is by far the preferable option.

Editing

Dec. 4th, 2022 05:29 pm
jsburbidge: (Default)
When I was a student, back in the day, I had an Elite portable typewriter which had belonged to my grandfather. (The fact that it was Elite meant I got a couple of extra lines' worth of words per page: N-page essay requirements always assumed Pica, which fit fewer words to the page.)

My practice throughout my university days was, invariably, to write out every paper in longhand, edit the draft heavily, and then transfer the edited draft to the typewritten form for handing in, doing a second edit as I went.

A period in publishing as an editor taught me the standard markup formats, which I had not bothered with up until that point. But by that time computers were coming in and editing tended to become a continuous process onscreen; I rarely had the opportunity to use paper editing on my own texts.

(I may add, in passing, and as qualification in what follows, that I still think that printing out and editing a text is the only really effective way to end up with a good text. I have occasionally edited source code in this way. It is the best way, bar none, to attain to brevity.)

These days all my work is onscreen; I don't even have a printer. I do, however, find that the discipline of separating writing and editing remains critical.

I will regularly make a first draft of a class, sleep on it, and decide, on sleeping on it, that the design needs significant changing. This is too close to the original composition to be refactoring; it is, fundamentally, part of the original design process, with no re- about it.

Much of the time this leads to simplification; when it does not, it is because it leads to generalization, more complex in one place, less complex overall.

(I might as well throw in a note about "emergent design". I tend to agree with James Coplien that design is something which has to happen as its own discipline, and can't just emerge from work with concrete classes plus some general pattern principles. When I work on a feature, or on a bug once it has been analyzed, I never work without hewing to an explicit design, even when that design is not written down. But from that perspective design is not so much part of composition as its prerequisite: I couldn't have done that longhand writing without a good sense of what my overall structure was to begin with.)

A lot of the code that I see in production looks as though it was produced by developers who left off as soon as they got the first draft that actually worked. It's verbose and full of copy-and-paste antipatterns. Boolean flags are used instead of strategies and in some extreme cases independent access to global variables is used instead of parameter passing.

It uses idioms which were learned early but are not optimal. For example, most developers started out writing loops using for and while; but in C++ maintainability, clarity, concision, and speed of execution are better served by using the STL algorithms. One might draft out one's thoughts using a for loop, but finished code should have the additional thought put into it of using an appropriate algorithm.

In all cases these are less clear and less maintainable and in most cases also less efficient at runtime. But it's a first cut that's left in that state because it works and people won't edit.
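As a small illustration of the kind of second pass I mean (the example is mine, not from any codebase in particular):

#include <algorithm>
#include <cstddef>
#include <vector>

// First draft: the thought sketched out as a hand-rolled loop.
int count_positive_draft(const std::vector<int>& v)
{
    int n = 0;
    for (std::size_t i = 0; i < v.size(); ++i)
        if (v[i] > 0)
            ++n;
    return n;
}

// After editing: the intent is carried by the algorithm's name.
int count_positive_edited(const std::vector<int>& v)
{
    return static_cast<int>(std::count_if(v.begin(), v.end(),
                                          [](int x) { return x > 0; }));
}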
jsburbidge: (Sky)
 I see headlines talking about risks to democracy after the election of a far-right party in Italy. I do not see headlines suggesting that that election shows the weaknesses of democracy in action.

Populist leaders tend not to be antidemocratic, at least by inclination. Many of the distinctive policies of the right-wing populist parties have heavy popular support - though they are policies which tend not to be supported by the major parties, and are frequently policies which run directly into constitutional limits in countries which have such limits (sorry, UK).

A simple example is the death penalty for murder.  In Canada, for years if not decades after the death penalty was abolished, it had broad general support in the population. None of the major political parties supported it (partly because it's very hard to find lawyers who support it - they are too much aware of the possibilities of miscarriages of justice) and so it remained off the agenda of Parliament.

The policies of the current Quebec government under Legault regarding dress and religious symbols, and restricting language choice, have broad support in the province as a whole - so much so that the various federal parties are unwilling to oppose them publicly - but would run directly into Charter challenges were it not for the use of the Notwithstanding clause.

Anti-immigrant policies play well with general populations almost everywhere. Opposition tends to come from an odd alliance of progressives and business groups (who need the labour pool).

The recent experience of COVID, and the current rush back by a majority of the population to "normalcy", including not wearing masks in public (mask-wearing being, when you consider it, a pretty minimal-cost step), isn't just driven by oligarchic leaders (however much they want people back in the office [1]) but comes up from the grassroots. It does lead to a lack of confidence in the judgement of the people as a whole on other issues.

Populations in general are covering their ears regarding appropriate steps to take on climate change. Acceptance of anthropogenic causes has become general, but willingness to take steps with any immediate cost is present in only a tiny segment of the population.

Much general discourse treats democracy as an end in itself. It isn't. To begin with, "representative democracy" is not, at least as practiced, democracy; it's a way of selecting between governments generally made up of representatives of much smaller slices of the population, generally in the top quintile of income.  This is further tempered in many jurisdictions by permanent civil services (ENArques in France, at an extreme) who represent a broad professional consensus of what policies are acceptable.

Secondly, most jurisdictions constrain political rulemaking by constitutional bills of rights.  These provisions regularly get applied. In some cases (the US Second Amendment, for example) there may be serious issues around the nature of the constraints, but most such rights are unambiguously "good" in principle. Consider the regular striking down of things like minimum sentencing provisions under the Charter, or rulings providing immigrants with some rights of review of immigration board decisions.

Democracies have typically worked better than other choices because they impose more constraints on arbitrary exercise of power. These constraints are intermittent - Liz Truss is essentially an unelected dictator until the next general election (unless she falls to internal party revolt) - but they do exist.

I, at least, do not as such want a democratic government so much as I want a just, prescient, and wise government. Unfortunately, nobody has ever devised a method to select for justice, prescience, and wisdom in the rulers.

Churchill's aphorism applies to this. The ideal government may very well be a truly enlightened despot, but it's difficult to find good monarchs, let alone genuinely enlightened ones.[2] Democracy has been the least bad model we have.

Democracies seem to have worked at their best when rising tides are lifting all boats. But if one current factor in the failure of governments generally to confront issues such as climate change is the failure to counter the pressure of money in politics, a more fundamental failure is the visible strong tendency of populations as a whole, when insecure, to prefer easy but obviously wrong nostrums peddled by populists to realistic but more challenging fixes.

So we have figures like Johnson and Truss, in England, or Poilievre and Smith, in Canada, or Trump and DeSantis in the US, or Meloni in Italy, who peddle long-term poison not despite, but because of, the broad wishes of the population.

The problem, as always, is finding a better solution. There is no obvious practical one - i.e. one reachable from here - on the horizon. And any path which could reach a different structural model would likely have to wade through a fair amount of blood to get there.

[1] Going by the messaging of my own employer's higher echelons, I think that they would be happy to see the offices full of employees all wearing masks, especially as the latter reduces the incidence of sick leave. Instead, what they are getting is sparse attendance, and almost everyone who shows up is not wearing a mask.

[2]Most monarchs historically were not unconstrained despots; they did a careful balancing act between competing groups of nobles. If the nobility as a whole turned against you, you were gone, or at least in deep trouble (John, Edward II, Richard II, Henry VI, at least, in England).
jsburbidge: (Default)
 I am not, in general, a great defender of As You Like It. The two themes in the air of the time which it made fun of - pastoral and melancholy - have long passed out of the common ken; it has even less plot than most Shakespeare comedies; it has remained popular primarily because of the character of Rosalind.

That being said, it has its points.  It is full of clever speech; it is, I believe, the first Shakespeare play with a Robert Armin fool rather than a Will Kemp fool, and so the first of his philosophic fools. It has a really clear distinction between the comic and everyday worlds which makes it a sort of concentrated template for Shakespeare's other festive comedies. ("How full of briars is this workaday world" over against Arden).

The performance by the Canadian Stage Company at the Dream in High Park mangled the play so badly that none of its virtues survived. This wasn't just the usual cutting in the interests of length, though it involved liberal cutting. It also involved adding extended amounts of slapstick, not just where the text might call for it, but in many places where it could get in only by beating the text over the head. It also had the most distracting costumes; I gather, after the fact, that for some reason the production presented all the characters as flowers. The court looked just as bizarre as Arden.

Much of Jaques and Touchstone was mangled or dropped, much to the detriment of both. At least they were not among the players who simply shouted their lines, or abbreviated versions of their lines.

It was, in short, an appalling production. I am disinclined to see any of their other productions.
jsburbidge: (Default)
There is a post on Charlie Stross's blog regarding a pledge by Rishi Sunak to eliminate degrees which lead to less well-paying jobs. Aside from noting that such a programme would likely lead to the abolition of Greats, the degree held by the current PM - as a rule, degrees in classics are not roads to riches - this would seem to be irrelevant, as Tory party members appear to give Liz Truss a pronounced edge (not because she's any brighter, but because she is more in their image).

But what is the value of a university degree? In the STEM area, generally, the "useful" (engineering end of the scale) degrees apparently now have a genuinely useful life of about five years. If you have a degree in pure math, it doesn't age at all, but it is about as useful as a degree in philosophy (which also doesn't age at all, at least if it covered core subjects).

On the other hand, my current employer not only wanted proof of my degrees from the early 1980s in an unrelated field (well, two unrelated fields) but apparently had the same demand of a colleague whose degree is from the late 1970s. It didn't care what they were in - experience rendered that irrelevant - but it certainly wanted proof of graduation.

Whatever Sunak believes, most degrees which are not specialized professional degrees have about the same value: employers want "a degree" for a vast number of middle-class office jobs and don't particularly care what in. For all that university calendars pitch the practical application of the most abstract of disciplines to students, a course of studies spent studying Peter Abelard, Guido da Montefeltro, and Dante is generally just as useful as a credential as one spent studying the most "relevant" of subjects.

Not that the former is very likely, these days. Many if not most smaller universities have abandoned anything even loosely related to the kinds of education which would satisfy anyone with a real appetite for systematic or eccentric knowledge. (Larger universities retain them because they need to support schools of graduate studies across a full range of disciplines.)

I have three degrees, each with its own lesson in later years.

The first was a BA from Trent University. In those days - which are now, I gather, considered part of the "early days" despite my very clear sense that I was nearly a decade after the real early days - it was a reasonable place to go for a small-class humanities degree, even if its tutorial system did not approach real Oxbridge tutorials. I did a major in English Literature and minors in Mathematics and Classics. What I did would now be impossible; the calendar no longer supports the courses I took.

My second was an MA taken with the course work from a doctoral programme at The Johns Hopkins University. I got out because I disagreed with where the discipline (and the humanities in general) were going. I cannot say in retrospect that my assessment was mistaken.

I then proceeded to a law degree at the University of Toronto. I was really the only student in my year who approached it out of an interest in law as such, and got the greatest amount out of courses in jurisprudence and legal history, including a directed research course in legal history. I did not get an offer to article at any of the firms I interviewed at. However, the degree did give me the one actual "practical" use of any of my degrees: it gave me a foothold as a legal editor at a Toronto publishing firm.

While there, I eventually shifted function and became a software developer, which is a long and complex story in itself. By the end of the 1990s I was experienced enough to get a place at a dot com startup, and went from there into development in the financial sector. At no time from that time on did anyone ever show any interest whatsoever in what I had studied at university, or what my grades were.

In retrospect what I "should" have done from a professional point of view was to take the Descartes scholarship the University of Waterloo was happy to offer me and, instead of pure math (which was my then-current interest), take a course in math and computer science. I would have taken a short cut of nearly 15 years to the same career with better credentials and a better choice of employers. I'm not sure that would have been my best choice in other respects; my collections of classics and mediaevalia argue otherwise. (Though there's certainly an argument to be made that taking a second bachelor's degree in Computer Science rather than going to law school would have been a better idea.)

So what was the economic value of the degrees I have? Relatively limited; indeed, a single four-year degree that I did not take would have almost certainly had a bigger impact than the three degrees I did take. Their benefit was not at the vocational level but at a purely intellectual level. Most of the skills I have I had when I graduated from high school, although with less practice (with the exception of software development, which I did not take up until after I had finished university entirely).

There is a frequently made case for abstract knowledge that it eventually turns out to be more useful than practically-directed research (a classic example is the applicability of Lie algebras to particle physics; or, a level down, of understanding of particle physics and quantum mechanics to the use of semiconductors in computing). I am more inclined to make the argument that abstract knowledge is a value in itself, and that the willingness to support the extension of abstract knowledge is one of the things society is judged on.
