[-] Historical_General@lemm.ee 2 points 7 months ago

If they tied a Bookwyrm comments section to an ISBN, for example, then anybody/any site could easily have it embedded, making it a universal tool rather than something specifically connected to a piracy site.

[-] Historical_General@lemm.ee 1 points 10 months ago

Yeah, it's silly and odd and likely done to push customers towards formats that they have greater control over.

Those epubs that aren't really epubs, randomly disallowing azw3 files (which they support officially!) from being downloaded directly from the Kindle's built-in browser, and other restrictive behaviour are all part of this. That's why I'm eventually looking to enable epubs on Kindle once the people at MobileRead find a way to do it. Apparently Calibre can also be set up to send files via email, so that's another option.

[-] Historical_General@lemm.ee 1 points 11 months ago

They're not, though. They only do over-the-cloud conversions from epub to an Amazon proprietary format, which can make the covers or formatting go awry.

[-] Historical_General@lemm.ee 8 points 11 months ago

I'm envisioning Bookwyrm behaving as a comments section for Anna's Archive (or possibly any decentralised book repository), but with reviews instead of comments. I'm reminded of the Disqus or Facebook widgets you often see embedded on certain sites.

[-] Historical_General@lemm.ee 5 points 11 months ago* (last edited 11 months ago)

It's not worthless, there's 500 million dollars worth of gas north of Gaza that Israel wants to secure. They're already stealing it and have been for years. And the 75 year long occupation must end of course.

[-] Historical_General@lemm.ee 2 points 11 months ago

I would have thought this was common knowledge. I suspect these redditors just don’t put any effort into recall or thinking in general.

[-] Historical_General@lemm.ee -3 points 11 months ago

No you dimwit. I read the papers. Normal people do do that. Dimwit.

[-] Historical_General@lemm.ee 17 points 11 months ago

Well this will fix the various social crises in that country for sure.

[-] Historical_General@lemm.ee -1 points 1 year ago* (last edited 1 year ago)

Gay rights are human rights.


cross-posted from: https://lemm.ee/post/12865151

Witch-hunting in 17th-century Scotland was so well paid that it attracted some blatant fakers – Susan Morrison

A witch-hunter nicknamed ‘The Bloody Juglar’ appears to have used a retractable needle to prick his victims without drawing blood, while another responsible for the deaths of many innocent women turned out to be a woman herself


At Spynie Palace in 1662, John Innes of Leuchars had a serious problem on his hands. Local people were complaining to him about milkless cows, shrivelling crops and dying children. Pretty obvious that a witch was on the loose. As the local law enforcement thereabouts, John was expected to do something, but witch-hunting was not in Mr Innes’s skill set.

It must have been a relief when a slight young man almost magically appeared in front of him: John Dickson’s the name, and witch-hunting’s the game. Bags of experience. Happy to sort the problem out. Possibly dropped the name of superstar witch-hunter John Kincaid into the conversation, a Tranent man with a reputation as Scotland's most fearsome witch pricker or ‘brodder’.

The Scots didn't do witch-ducking. We went for the needle. The Devil gave his followers marks somewhere on their bodies. Where the Devil left his mark, there would be no blood, and no pain. Kincaid and his like would use the needle to ‘prick’ the accused. The words prick and needle are misleading. This needle was no dainty thing to be lost easily in a haystack. These were more like hefty great crochet hooks. The ‘pricking’ was more of a violent slam into the body.

The mark could be anywhere. The accused were stripped and shaved, and the needle plunged in. Some victims didn’t move, scream or bleed – the mark had been found. Possibly they couldn’t move. They may have been in deep shock. These were pious times.

Women rarely left home without covering their heads; now they stood publicly naked, shaved and exhausted. There may well have been little or no bleeding if the needle hit a part of the body with a poor blood supply. Or perhaps the needle was retractable.

There are clues to such trickery. In the late 17th century, a witch-hunter nicknamed “The Bloody Juglar” turned up in Berwick-upon-Tweed. Pretty quickly his trusty needle pricked a victim and drew no blood. A witch, ready for trial and execution. Hold up, said Colonel Fenwick, the town’s military governor. He called in the mayor and the magistrates. He was worried that this evidence was falsely procured. He had his suspicions about that needle.

Why not get The Bloody Juglar to do the pricking again, but with a council-provided needle? Our boy baulked – “by no means would he be induced unto”. To the good people of Berwick, this “was a sufficient Discovery of Knavery”. The Juglar was busted.

John Kincaid may have been a knave, but between 1649 and 1662 he rampaged freely. It was lucrative. He pocketed £6 for a discovery of a witch at Burntcastle estate. They chucked in another £3 to cover the booze bill for him and his manservant.

The year 1659 was a busy one. Kincaid seems to have pricked profitably in East Lothian, where 18 accused witches were executed. In 1661, Forfar was so chuffed with his efforts that they gave him the freedom of the burgh.

Perhaps young John Dickson was inspired by Kincaid. Seemed a good trade for a lad, finding God's enemies and being very well paid for it, too. John headed north, fetched up at Spynie Palace and appeared before the harassed Innes, who wasted no time in signing up his new witch-hunter to an exclusive contract.

John was on a good retainer with performance-related bonuses, six shillings a day expenses plus £6 per witch caught. In no time at all, our man on the make had two servants and a very fancy horse. He was on-call and carried out witch-pricking in Elgin, Forres, Inverness and Tain. He possibly pricked Isobel Goudie, Scotland’s most famous witch.

He had a particular take on the procedure. Folk called him the Pricker “because of his use of a long brasse pin”. He had his victims stripped naked, then the “spell spot was seen and discovered. After rubbing over the whole body with his palms.” In a vicious witch-hunt/clan war in Wardlaw on the banks of Loch Ness, 14 women and one man were treated so savagely under John’s direct supervision that some of them died.

Our boy was on a roll, until he did something stupid. He pricked a man named John Hay, a former messenger to the Privy Council. Now, this was not a man to mess with. He had connections. He wrote to Edinburgh complaining in an incredibly civil servant manner, denouncing the witch-pricker who worked on his case as a “cheating fellow” who carried out the torture without a licence. Even witch-hunters need the correct paperwork.

The Privy Council in Edinburgh agreed. They called the maverick Mr Dickson in for a word. And they made a terrible discovery: John Dickson was a woman. Her name was Christian Caddell, and she came from Fife. Oh, she could tell a witch, no doubt about it. She claimed she spotted them by looking into their eyes and seeing an upside-down cross.

Of course, this was not the scientifically accepted manner of witch-finding. A needle must be used. And, obviously, you needed to be a man.

Christian stood trial, not for fake witch hunting, torturing or even for those murderous deaths, but for wearing men’s clothing. She was sentenced to transportation, and on May 6 she sailed from the port of Leith on the ship Mary, bound for Barbados.

On the day she left Scotland, Isobel Elder and Isabel Simson, pricked by John Dickson, aka Christian Caddell, were burned in Forres. Just because you were discovered to be a witch in the wrong way didn’t mean to say you were innocent. They were the last two victims of the cross-dressing counterfeit witch-pricker.



cross-posted from: https://lemm.ee/post/12600657


Seventeenth-century English antiquarians thought that Stonehenge was built by Celtic Druids. They were relying on the earliest written history they had: Julius Caesar’s narrative of his two unsuccessful invasions of Britain in 54 and 55 BC. Caesar had said the local priests were called Druids. John Aubrey (1626–1697) and William Stukeley (1687–1765) cemented the Stonehenge/Druid connection, while self-styled bard Edward Williams (1747–1826), who changed his name to Iolo Morganwg, invented “authentic” Druidic rituals.

Druidism has come a long way since. In 2010, The Druid Network was listed as a charity in England and Wales, essentially marking the official recognition of Druidism as a religion. (74,000 called themselves Druids in a recent census.) Historian Carole M. Cusack positions Druidism as one of the branches of the tree of Paganism and/or New Age-ism(s), which burst into all sorts of growth during the twentieth century. Modern Druidism fits into the smorgasbord of what Cusack calls the “deregulated spiritual marketplace” of our times.

But there’s a disconnect here. In the popular imagination, Stonehenge and Druidism now go together like tea and crumpets. Historically, Stonehenge, a product of Neolithic Britain, predates Caesar by thousands of years. It had nothing to do with Druids and certainly nothing to do with modern Druidism.

“The false association of [Stonehenge] with the Druids has persisted to the present day,” Cusack writes, “and has become a form of folklore or folk-memory that has enabled modern Druids to obtain access and a degree of respect in their interactions with Stonehenge and other megalithic sites.”

Meanwhile, archaeologists continue to explore the centuries of construction at Stonehenge and related sites like Durrington Walls and the Avenue that connects Stonehenge to the River Avon. Neolithic Britons seem to have come together to transform Stonehenge into the ring of giant stones—some from 180 miles away—we know today. Questions about construction and chronology continue, but current archaeological thinking is dominated by the findings and analyses of the Stonehenge Riverside Project of 2004–2009, whose surveys and excavations were the first major archaeological explorations of Stonehenge and its surroundings since the 1980s. The project archaeologists postulate that Stonehenge was a long-term cemetery for cremated remains, with Durrington Walls serving as the residences and feasting center for its builders.

The hippie-turned-New Age movements birthed in the 1960s and 1970s resulted in a surge of interest in Stonehenge. Tens of thousands, not all of them Druids, attended the Stonehenge Free People’s Festival starting in 1974. In 1985, the festival was halted by English Heritage, the organization that maintains Stonehenge today, because of the crowds, disorder, and vandalism. Druids were also banned from performing rituals on site. However, English Heritage and the Druids soon came to an understanding: Druids could use the site as long as there was no associated festival.

So the clash of academic archaeology and what might be called folk archaeology comes into stark focus at Stonehenge.

Modern paganism is not without interest, of course, but continuing revelations about prehistory—whether of neolithic Britain or elsewhere—should be a lot more interesting. So are the techniques used to extract data from the past. One example, used to telling effect by the Stonehenge Riverside Project, is the analysis of lipid residues on pottery, which can tell whether a pot held dairy products or the fat of ruminants or pigs, giving insights into diet four thousand years ago. Another: strontium isotopes in bovine molars show that beef consumed at Durrington Walls was raised at least thirty miles away.

Of course, all this is not as photogenically mysterious/magical as robed Druids in the long shadows of a midwinter sunset. Academic archaeology, which suffers from charges of “elitism” in the reactionary populist politics of anti-intellectualism and anti-science, has a hard time competing with the popular irrationality of mysticism. Maybe the real Stonehenge needs more publicists.


Subscribe to !history@lemm.ee and !history@lemmy.ml


cross-posted from: https://lemm.ee/post/10945207

Long Read Review: Hitler’s American Model: The United States and the Making of Nazi Race Law by James Q. Whitman

*In Hitler’s American Model: The United States and the Making of Nazi Race Law, legal scholar James Q. Whitman examines how Nazi Germany looked to the model of the Jim Crow laws in the USA when formulating the Nuremberg Laws in the 1930s. This is a carefully researched and timely analysis of how racist ideology can penetrate the political and institutional fabric of societies, further underscoring its continued impact in the USA today, writes Thomas Christie Williams.*

After the full horrors of Nazism were exposed at the end of World War II, eugenics – in Francis Galton’s words, the ‘science which deals with all influences that improve the inborn qualities of a race’ – as a social and scientific movement slowly faded from public view. The fact that Ronald Fisher, the founder of the modern discipline of genetics, and John Maynard Keynes, the economist whose ideas underpinned the New Deal, were active members of the Eugenics Society is now rarely discussed at Cambridge University, where they spent much of their academic careers. In 1954, the name of scientific journal the Annals of Eugenics was changed to the Annals of Human Genetics, and in 1965 the incoming recipient of the Chair of Eugenics at UCL, Harry Harris, became instead the Galton Professor of Human Genetics.

However, two groups of people have worked hard to keep memories of this great enthusiasm for a ‘scientific’ approach to institutionalised racism alive. The first are those who see understanding the history of the twentieth century as important, in order that we do not make the same mistakes again. They argue that whilst Nazism was the extreme end of the spectrum, it espoused views on nationality and race that were, if not mainstream, definitely recognised as acceptable by many sectors of society in Europe and the Americas. James Q. Whitman, author of Hitler’s American Model: The United States and the Making of Nazi Race Law, falls into this camp.

A legal scholar, Whitman identifies many commonalities between Nazi legislation in the early 1930s, which sought to exclude Jews from German public life, and the ‘Jim Crow’ laws enacted to exclude African Americans in the United States. Moving beyond commonalities, he argues that Nazi lawyers and the German public had a keen interest in US race law. As an example, he cites a 1936 article on racial policy in Neues Volk (New Volk), a propaganda newsletter from the National Socialist Office, which included a US map labelled ‘Statutory Restrictions on Negro Rights’, detailing disenfranchisement and anti-miscegenation laws in the 48 mainland US states.

The second group is the far-right movements arguably edging into the mainstream in the United States and Europe (in Hungary or Holland, for example). The chants of ‘Blood and Soil’ from the recent white supremacist rallies in Charlottesville, Virginia were an explicit reference to the Nazi ideal of ‘Blut und Boden’, and those gathered there are united by their fascination with fascist ideology and rhetoric. Vanguard America argues in its manifesto for an economy ‘free from the influence of international corporations, led by a rootless group of international Jews, which place profit beyond the interests of our people’. Membership of the National Socialist Movement (described on their website as ‘America’s Premier White Civil Rights Organization’) is ‘open to non-Semitic heterosexuals of European descent’, and a popular blogger for the alt-right, Mike Peinovich, who spoke at Charlottesville, hosts a chat show entitled ‘The Daily Shoah’.

Hitler’s American Model is therefore a timely and sobering outline of how racist ideology can make its way into the political fabric of a country. It focuses on the changes introduced by Nazi lawyers post-1933, but we also learn much about how this developed in the United States. Whilst in the latter the case law excluding non-whites from public life developed over decades, in Nazi Germany the Nuremberg Laws were drafted and introduced in 1935, just two years after Hitler became Chancellor. Whitman’s main premise is that in this accelerated process, German lawyers and officials took inspiration and concrete guidance from legal practice across the Atlantic.

Reading the book, two sets of records stand out, one for their presence, and the other for their absence. The first is the stenographic report of a 5 June 1934 meeting of the Commission on Criminal Law Reform. Whitman’s twenty-page description of this transcript makes for gripping reading, and is the highlight of the book (94-113). The second is the lack of documentation regarding a September 1935 US study tour by 45 German lawyers (132). The trip was apparently a reward for their success in finalising the Nuremberg Race Laws, laid out by Hermann Göring at a rally only a few weeks earlier. As Dr. Heubner, chief of the Nazi Jurists’ Association, told the tour group before they left: ‘through this study trip the upholder of German law [will] gain the necessary compensation for an entire year of work’ (133). According to Whitman, the historical record tells us that on arrival in New York, at a reception organised by the New York City Bar Association, the group were met by a noisy demonstration lasting six hours and requiring a police presence. However, in Whitman’s words: ‘sadly it does not seem possible to learn more about how […] the group fared on their study trip’. From the first set of records we learn much about how German lawyers saw their American counterparts; from the second (missing) set, we might have learnt more about how the American establishment viewed legal developments in the Third Reich.

Assembled at the 1934 meeting were seventeen lawyers and officials, and their brief was to respond to the demands of the Prussian Memorandum of September 1933. This document argued that the ‘task of the National Socialist State is to check the race-mixing that has been underway in Germany over the course of the centuries, and strive towards the goal of guaranteeing that Nordic blood, which is still determinative in the German people, should put its distinctive stamp on our life again’ (85). The final outcome of such meetings was the Nuremberg Laws, which consisted of three parts. The first, the Flag Law for the Reich, declared the swastika to be the only German national flag. The second, the Citizenship Laws, created a difference between German nationals – ‘any person who belongs to the mutual protection association of the German Reich’ – and the citizen – ‘a national of German blood’ who was the ‘sole bearer of full political rights’ (29). The third, the Nuremberg Blood Laws, made a criminal offence of marriage or extramarital sex between ‘Jews and nationals of German blood’ (31).

Whitman’s description of the 1934 meeting is gripping for a number of reasons. Firstly, it allows the opportunity to witness the mechanics of discrimination at work. We learn how a group of highly educated professionals – civil servants, legal academics, medical doctors – came together to formulate a set of profoundly exclusionary and undemocratic laws. The committee was faced with a number of questions. How could one define race in legal terms? Could it be possible to criminalise an act (in this case, sexual relations between a German and a Jew) to which two competent parties had consented? Secondly, as a non-American, it further underscores the deeply institutionalised discrimination within US law at this time, belying the idea that a supposedly independent judiciary can act to protect the rights of all citizens.

In Whitman’s interpretation, two groups were pitted against each other at the 1934 meeting. The first were juristic moderates, who felt that a policy of criminalising German and Jewish sexual relations was not in keeping with the German legal tradition. German criminal law, they argued, was based on clear and unambiguous concepts (105). Race, and in particular Jewishness, was difficult to ‘scientifically’ define (105); judges could not be expected to convict on the basis of vague concepts. Their adversaries were Nazi radicals, who argued that a new Criminal Code should be drawn up using the ‘fundamental principles of National Socialism’ (96). According to Whitman, it was these radicals who championed American law, already touched on in the Prussian Memorandum.

As it turns out, the American approach to defining race was not greatly troubled by the absence of a scientific conceptualisation. For the Nazi radicals, this was a heartening example. Roland Freisler, a State Secretary attached to the Ministry of Justice, pointed out: ‘How have they gone about doing this [defining race]? They have used different means. Several states have simply employed geographical concepts […] others have conflated matters, combining geographical origin with their conception of a particular circle of blood relatedness’ (107). Freisler continued:

they name the races in some more primitive way […] and therefore I am of the opinion that we can proceed with the same primitivity that is used by these American states (109).

Contrary to established German tradition, Nazi radicals believed that judges should be given freedom to institute racist legislation, without the need to come up with a scientifically satisfactory definition of race.

It is hard to argue with Whitman’s assertion that Nazi jurists and policymakers took a sustained interest in American race law, and that this helped shape the legal and political climate that led to the promulgation of the Nuremberg Laws. What Whitman moves on to in his conclusion is the extent to which the American legal and political system as a whole, beyond Jim Crow, was permeated with racism: laws related to race-based immigration, race-based citizenship and race-based anti-miscegenation. He makes the unsettling argument that America and Nazi Germany were united by a strong egalitarian, if not libertarian (in the Nazi case), ethos. This ethos, he argues, is that of all white men being equal, and thus it was not surprising that Nazism – in Whitman’s view an egalitarian social revolution for those self-defining as of German origin – turned to America for inspiration. As Whitman points out, white supremacy has a long history in the US, from 1691, when Virginia adopted the first anti-miscegenation statute, to 1790, when the First Congress opened naturalisation to ‘any alien, being a free white person’ (145), to the anti-immigration laws that followed the San Francisco Gold Rush and the segregation laws that followed the Civil War. In the wake of the Charlottesville protests, he would probably argue against Senator John McCain’s assertion that ‘white supremacists and neo-Nazis are, by definition, opposed to American patriotism and the ideals that define us as a people and make our nation special’.

Whitman also questions whether the US common law system really serves to protect the freedom of individuals against an over-reaching state. He points out that the Nazis, rather than taking over the pre-existing German civil law system, reformed it according to a common law model. Nazi officials were given discretion to act in what they believed to be the ‘spirit of Hitler’ (149), brushing aside the legal scientific tradition of the moderates of the 1934 meeting. He argues that when it came to race, American ‘legal science’ tended to yield to American politics and left much racist legislation untouched.

So where does that leave the ‘science’ of eugenics, and the ‘legal science’ of the jurists working in a civil code system? Does a logically consistent approach of any kind protect individual liberties, or rather open up a way to discriminate based on supposedly objective measures? An important point, not explicitly made by Whitman but implicit throughout the book, is that the supposed objectivity of a scientific approach (whether in biology or the law) can easily be misused by those whose aims are clearly undemocratic and unegalitarian. On ‘The Daily Shoah’ and other racist websites, substantial discussion is devoted to ‘metrics’ related to, for example, race and IQ or sexual orientation and the chance of conviction for paedophile offences.

The Charlottesville protests were sparked by the decision to remove a statue of Robert E. Lee, a Confederate General in the Civil War: proponents of the removal argued that it served as a monument to white supremacy. Conversely, in the United Kingdom, a similar controversy surrounding a petition to remove Cecil Rhodes’s statue in Oriel College Oxford failed to lead to its removal, and the Galton Institute in London (which acknowledges its founding as the Eugenics Education Society in 1907, but disassociates itself from any interest in the theory and practice of eugenics) continues to fund research and award essay prizes on genetics for A Level students. Clearly retaining the material legacy of historical figures runs the risk of allowing their glorification (as in Charlottesville), whitewashing or suggesting implicit sanction of their actions.

However, in Whitman’s view, to try to forget or ignore these figures and their ongoing influence on society today is the more dangerous option. Hitler’s American Model is a thoughtful and carefully researched account of how the legal community in the US and Germany proved ‘incapable of staving off the dangers of the politicization of criminal law’ (159). He worries that:

the story in this book […] is not done yet […] what Roland Freisler saw, and admired, in American race law eighty years ago is still with us in the politics of American criminal justice (160).

Given recent developments in American politics, this should perhaps give us all pause for thought.


Subscribe to !history@lemm.ee :)

submitted 1 year ago* (last edited 1 year ago) by Historical_General@lemm.ee to c/fantasy@lemmy.ml

When Britain was gripped by 'fairy mania'

"Fairycore" may be trending on social media today but 100 years ago supernatural sprites were a national obsession. Holly Williams explores fairy fever.

Imagine a fairy. Is the picture that appears in your mind's eye a tiny, pretty, magical figure – a childish wisp with insect-like wings and a dress made of petals?

If so, it's likely you've been influenced by Cicely Mary Barker, the British illustrator who created the Flower Fairies. 2023 marks 100 years since the publication of her first book of poems and pictures, Flower Fairies of the Spring – an anniversary currently being celebrated in an exhibition at the Lady Lever Gallery in Merseyside, UK.

The Flower Fairies' influence has endured: they have never been out of print, and continue to be popular around the world – big in Japan and in Italy, where Gucci released a children's range featuring Barker's prints in 2022. Billie Eilish recently had Flower Fairies tattooed on her hand, while their whimsical, floral aesthetic can be seen in the TikTok "fairycore" trend.

(An exhibition at the Lady Lever Art Gallery explores the Flower Fairies phenomenon, and features pantomime costumes (Credit: Pete Carr))

Barker's delicate watercolours certainly helped cement several tropes we now consider classic – almost essential, in fact – in the iconography of the fairy: they are miniature, sweet and youthful, they are intertwined with plants and the natural world, and they are distinctly twee. Yet her drawings were also "firmly footed in realism", points out Fiona Slattery Clark, curator of the show. "The children were all painted from life [and] her plants and flowers are as realistic as possible." Barker drew children from the nursery school her sister ran in their house in Croydon, near London; each was assigned a flower or tree, and Barker's detailed illustrations were botanically accurate – she would source samples from Kew Gardens, says Slattery Clark. Even the petal-like wings and fairy outfits were closely based on plants: an acorn cup becoming a jaunty cap, a harebell becoming a prettily scalloped skirt.

For many hundreds of years, fairies were not necessarily tiny and fey, but grotesque or fierce elemental forces

The Flower Fairies were an immediate hit – but Barker was far from the only artist of her era to find success with fairies. In fact, fairy fever swelled within the United Kingdom for over half a century, reaching something of a peak around the time the Flower Fairies emerged in 1923. Over 350 fairy books were published in the UK between 1920 and 1925, including Enid Blyton's first fairy foray, a collection of poems called Real Fairies, in 1923. Fairy art even had the stamp of royal approval: Queen Mary was a fan of Ida Rentoul Outhwaite's ethereal drawings, and helped popularise them by sending them in postcard form.

Fairies have long been with us – in our imaginations, at least. But for many hundreds of years, they were not necessarily tiny and fey, but grotesque or fierce elemental forces, capable of great darkness. "In 1800, if you thought your child was a fairy it would have been like demonic possession – you would have put that child in the fire to drive out the fairy," points out Alice Sage, a curator and historian.

(Each of Barker's fairies corresponded to a plant, tree or flower – pictured, the Silver Birch Fairy (Credit: Estate of Cicely Mary Barker 1934 Flower Fairies))

Yet within 100 years, the whole conception of fairies completely changed. "Throughout the 19th Century, fairies became increasingly miniaturised, sapped of their power – trapped in the nursery," says Sage. As the Victorian era progressed, they became increasingly associated with childhood; as their popularity grew, they shrank.

But first, fairies became a fashionable subject for Victorian artists, often taking inspiration from Shakespeare's A Midsummer Night's Dream and The Tempest. John Anster Fitzgerald, Edwin Landseer, John Everett Millais, Joseph Noel Paton, Arthur Rackham and even JMW Turner – among many others – painted supernatural sprites from the 1840s onwards. But there was still a sense of otherworldly strangeness in many of their depictions – as seen in the work of Richard Dadd, who made his hyper-intricate fairy paintings while living in a Victorian asylum after killing his father.

Then two wider cultural developments came along that changed fairy reputations forever. One was that "children's literature happened", says Sage. The Victorians promoted the idea of childhood as a time of innocence, requiring its own entertainment. Illustrated children's books really took off from the 1870s, with fairies a staple, and increasingly cutesy, feature. The second was pantomime. "Every Victorian pantomime would have this big spectacle of transformation at the end, where children dressed as fairies filled the stage," says Sage. The standard fairy fancy dress outfit today is basically the same as what these Victorian children would have worn: think tinsel, sparkly sequins, and translucent, gauzy wings.

Huge popularity

Moving into the 20th Century, fairies showed few signs of buzzing off – if anything, they cemented their place. "In the Edwardian era, Peter Pan started to be performed [in 1904], and that carried on for the next 25 years," points out Slattery Clark – enough time for several generations of children to learn to clap their hands to show they believe in fairies.

And as the new century lurched through global upheaval via World War One, fairy mania continued – if anything, widening and deepening. "That golden age of children's literature is really an upper middle-class phenomenon," points out Sage. "What happened from World War One onwards is it explodes beyond that, and becomes an adult concern."

(The costumes displayed in the exhibition are based on the Flower Fairies illustrations (Credit: Pete Carr))

Having been whisked from the woods into the nursery, fairies then made their way to troubled adults on the battlefield or waiting at home. Consider the huge popularity of a print, Piper of Dreams by Estella Canziani, during World War One: a wispy image of a man playing a pipe and surrounded by tiny fairies, it sold a staggering quarter of a million copies in 1916 alone.

"It's about belief and it's about hope – that's what fairies represent in that time," says Sage. "The supernatural becomes a way of finding some luck and brightness, [when] people don't have control over their lives, their future, their families."

For Conan Doyle, it was all about a search for another realm of being that related to life after death, vibrations, telepathy, telekinesis – Alice Sage

Today, we associate fairies with little girls – but this was an era when fairy art was popular with grown men, too. And technology helped spread it: there was an explosion in sending postcards around this time. They were cheap to buy, and free to post to a serving soldier abroad. "Suddenly everyone can send fairies, and they're flying through the air and across the seas. You can’t underestimate the practical aspect of it," says Sage.

Indeed, Barker herself cut her teeth illustrating such postcards: she produced a patriotic series showing "Children of the Allies", in different forms of national dress, in 1915, followed by a series of characters from Shakespeare, before teasing the Flower Fairies with a set of "Fairies and Elves" postcards in 1918.

Barker never made any claims for fairies being real – "I have never seen a fairy", she wrote in a foreword to Flower Fairies of the Wayside. But it is worth noting that she first published the Flower Fairies at a moment when the desire to believe in magical beings was at a rare high. In 1920, Britain was gripped by the story of the Cottingley Fairies, after two girls claimed to have photographed fairies at the bottom of their garden in West Yorkshire – and were widely believed.

Their beautiful photographs were created by paper cut-outs, floating on hat pins. Although many were sceptical, they nonetheless also fooled many of the great and the good – the photographs were brought to prominence by no less than Sir Arthur Conan Doyle, the author of Sherlock Holmes, who wrote a whole book about it, The Coming of the Fairies, in 1922.

(The Crocus Fairies from Flower Fairies of the Spring – the watercolours are still popular today with "fairycore" fans (Credit: Estate of Cicely Mary Barker 1934 Flower Fairies))

Cousins Elsie Wright and Frances Griffiths were aged 16 and nine when they took the first photos. Many years later, in the 1980s, they admitted it was a hoax, explaining that they kept up the pretence that the fairies were real because they felt sorry for the middle-aged men, like Conan Doyle, who so wanted to believe. There was, at the time, a serious resurgence in spiritualism in the UK, with seances and attempts to contact the dead proving understandably tempting for the bereaved. Conan Doyle himself became interested in a spirit world after his son died in the war. And for believers, this wasn't "woo-woo" nonsense – it was supposedly based in science. After all, scientific advances were genuinely explaining hitherto unknown and invisible aspects of our world.

"For Conan Doyle, it was all about a search for another realm of being that related to life after death, vibrations, telepathy, telekinesis – this fascinating world on the edge of the limits of human perception," says Sage. "And obviously that's connected to the loss of his son in World War One."

Like the Flower Fairies, the Cottingley photographs further reinforced the association between children and fairies, as well as cementing what a fairy looked like in the public consciousness. Yet aside from Tinkerbell, Flower Fairies are probably the only image from the fairy-fever era still instantly recognisable today. Why, of all the fairy content out there, have Barker's images endured so strongly over the past 100 years?

"They were [originally published] in full colour, and a lot of books were published in black and white," begins Sage. What looked novel at the time, now seems charmingly period – but the delicacy, intricacy, and imagination of Barker's pictures can still cast a spell. "It's like dolls houses – things that are very miniaturised, but very detailed and realistic, scratch a certain itch," suggests Sage. "They are absolutely beautiful, which helps."

"It's a real celebration of nature – there is a strong educational aspect to her work," puts forward Slattery Clark, emphasising the botanical accuracy of Barker’s drawings. The educational argument might sound absurd given we're discussing fairy art, but as a child who was obsessed with Flower Fairies, I can attest to the truth of it: all the wildflowers I know the names of I learned from these books.

(Cicely Mary Barker's exquisite illustrations were hugely popular in the 1920s (Credit: Estate of Cicely Mary Barker))

Having each fairy very specifically related to a particular plant was also commercially canny – whether Barker intended this or not, it created space for identification, for collectability, for a kind of innate brand franchising. "In children's culture, we create series that are collectable, that you identify with… It's like Pokemon or something!" laughs Sage. "When I speak to people about the Flower Fairies, especially groups of sisters, it's always 'which one were you?'"

Still, Sage is pleased to see the Flower Fairies exhibited in a fine art context at the Lady Lever gallery. For a long time, men painting fairies has been considered art – but when women do it, it's just silly flowery stuff for children.

"This is fine art – it's mass, popular fine art," insists Sage. "I think a lot of the diminishment of fairies and children's illustration is from a misogynist, snobbish and elitist art historical tradition. I'm so excited to see this kind of exhibition, that reclaims this history." Consider this a beating of wings, then, that takes fairies back out of the nursery – and into the gallery.

Flower Fairies is at the Lady Lever Art Gallery, Port Sunlight Village, UK until 5 November.

Holly Williams' novel What Time is Love? is out in paperback now.


cross-posted from: https://lemm.ee/post/10358195

The road from Rome

The fall of the Roman Empire wasn’t a tragedy for civilisation. It was a lucky break for humanity as a whole

For an empire that collapsed more than 1,500 years ago, ancient Rome maintains a powerful presence. About 1 billion people speak languages derived from Latin; Roman law shapes modern norms; and Roman architecture has been widely imitated. Christianity, which the empire embraced in its sunset years, remains the world’s largest religion. Yet all these enduring influences pale against Rome’s most important legacy: its fall. Had its empire not unravelled, or had it been replaced by a similarly overpowering successor, the world wouldn’t have become modern.

This isn’t the way that we ordinarily think about an event that has been lamented pretty much ever since it happened. In the late 18th century, in his monumental work The History of the Decline and Fall of the Roman Empire (1776-1788), the British historian Edward Gibbon called it ‘the greatest, perhaps, and most awful scene in the history of mankind’. Tankloads of ink have been expended on explaining it. Back in 1984, the German historian Alexander Demandt patiently compiled no fewer than 210 different reasons for Rome’s demise that had been put forward over time. And the flood of books and papers shows no sign of abating: most recently, disease and climate change have been pressed into service. Wouldn’t only a calamity of the first order warrant this kind of attention?

It’s true that Rome’s collapse reverberated widely, at least in the western – mostly European – half of its empire. (A shrinking portion of the eastern half, later known as Byzantium, survived for another millennium.) Although some regions were harder hit than others, none escaped unscathed. Monumental structures fell into disrepair; previously thriving cities emptied out; Rome itself turned into a shadow of its former grand self, with shepherds tending their flocks among the ruins. Trade and coin use thinned out, and the art of writing retreated. Population numbers plummeted.

But a few benefits were already being felt at the time. Roman power had fostered immense inequality: its collapse brought down the plutocratic ruling class, releasing the labouring masses from oppressive exploitation. The new Germanic rulers operated with lower overheads and proved less adept at collecting rents and taxes. Forensic archaeology reveals that people grew to be taller, likely thanks to reduced inequality, a better diet and lower disease loads. Yet these changes didn’t last.

The real payoff of Rome’s demise took much longer to emerge. When Goths, Vandals, Franks, Lombards and Anglo-Saxons carved up the empire, they broke the imperial order so thoroughly that it never returned. Their 5th-century takeover was only the beginning: in a very real sense, Rome’s decline continued well after its fall – turning Gibbon’s title on its head. When the Germans took charge, they initially relied on Roman institutions of governance to run their new kingdoms. But they did a poor job of maintaining that vital infrastructure. Before long, nobles and warriors made themselves at home on the lands whose yield kings had assigned to them. While this relieved rulers of the onerous need to count and tax the peasantry, it also starved them of revenue and made it harder for them to control their supporters.

When, in the year 800, the Frankish king Charlemagne decided that he was a new Roman emperor, it was already too late. In the following centuries, royal power declined as aristocrats asserted ever greater autonomy and knights set up their own castles. The Holy Roman Empire, established in Germany and northern Italy in 962, never properly functioned as a unified state. For much of the Middle Ages, power was widely dispersed among different groups. Kings claimed political supremacy but often found it hard to exercise control beyond their own domains. Nobles and their armed vassals wielded the bulk of military power. The Catholic Church, increasingly centralised under an ascendant papacy, had a lock on the dominant belief system. Bishops and abbots cooperated with secular authorities, but carefully guarded their prerogatives. Economic power was concentrated among feudal lords and in autonomous cities dominated by assertive associations of artisans and merchants.


Read more through the link. And join lemm.ee/c/history



cross-posted from: https://lemm.ee/post/9176670

Tolkien couldn't stand cars, and his philosophy of embracing walking and biking might just be the key to a hobbit's happy and cheerful life.

Most of us who love the Lord of the Rings books have felt the appeal of a hobbit's life. These merry little folk live generally uncomplicated and joyful lives full of good cheer, song, good food and jolly (if sometimes nosey) community.

One of the most self-evident ways to live like a hobbit is also pretty countercultural. It's to ditch your car in favor of walking or biking to your destination instead.

Hobbits, and most of the good creatures in Lord of the Rings, consistently opt for a simpler and slower pace of life. Industrialization and polluting machinery in the series are consistently symbols of evil, embodied by Sauron and his orcs.

The series’ none-too-subtle rejection of industrialization reflects author J.R.R. Tolkien’s own personal views. After owning a car for a time when his four children were little (an experience that provided the inspiration for his little-known storybook Mr. Bliss), Tolkien sold the car and switched to a bicycle as a matter of principle.

A cautionary message from his writings is that “we must recognize the machine for what it is — a mere tool with the potential to enslave, against which we must be ever on guard.”

Tolkien’s loathing of motor-cars

It’s a little-known fact that Tolkien abhorred cars to an intense degree. As Tom Neas points out in a Geek Insider piece:

Tolkien did own a car for a short period of time. He purchased a Morris Cowley in 1932, which he named “Jo.” A few years later he replaced Jo with a new car, creatively named “Jo 2.”

Tolkien was not a good driver; on an early visit to his sister he knocked down part of a stone wall. However, he was brazen, speeding down Oxford streets with little concern for other drivers or pedestrians, crying “Charge ’em and they scatter!”

By the start of the Second World War, Tolkien gave up driving, as rationing had begun. Around the same time, he noticed the damage that cars did to the landscape and never drove again, which gave rise to his more well-known negative views on cars.

Cars destroyed peace and beauty, he felt, and made life less pleasant all around. He referred to a car’s motor as the “infernal combustion engine,” as in a letter to his son in 1944,

It is full Maytime by the trees and grass now. But the heavens are full of roar and riot. You cannot even hold a shouting conversation in the garden now, save about 1 a.m. and 7 p.m. – unless the day is too foul to be out. How I wish the “infernal combustion” engine had never been invented. Or (more difficult still since humanity and engineers in special are both nitwitted and malicious as a rule) that it could have been put to rational uses—if any.

Cars symbolize “spirit of Isengard”

I don’t think it’s a coincidence that he named the villain in The Hobbit Smaug, like the smog of factories and machines.

He referred to “destroying Oxford in order to accommodate motor-cars” as an example of “the spirit of ‘Isengard,’ if not of Mordor.”

Harsh words, but does Tolkien perhaps have a point? Besides the congestion, traffic, and commotion cars cause, they can distance us from our neighbors, removing opportunities for casual daily interactions that bring so much happiness.

A wealth of research supports the mental and physical health benefits of walking and biking (check out the Netflix documentary Live to 100: Secrets of the Blue Zones for more!).

If you’ve ever wanted to live like a hobbit, maybe you can find an opportunity to ditch the car for an outing this week. Any chance that destination is close enough to walk or bike instead? And who knows, perhaps this outing will lead you to a memorable adventure you never would have found otherwise.

  • Dune: Part Two is set for a November 3, 2023 release with a six-week IMAX run, showing potential changes in the movie industry, favoring non-superhero films.
  • IMAX's prioritization of Dune: Part Two over Disney's The Marvels suggests a shift in the industry towards storytelling and inclusion, offering more opportunities for diverse genres and impacting future Hollywood releases.

Denis Villeneuve's much-anticipated Dune: Part Two releases on November 3, 2023, with a six-week IMAX run, indicating a potentially positive change in the movie industry. With Marvel's next theatrical release, The Marvels, scheduled to hit theaters a week later, there was an expectation that Dune: Part Two might be moved to next year to accommodate this. However, IMAX CEO Richard Gelfond confirmed Dune: Part Two won't be delayed.

Dune: Part Two's predecessor Dune was a box-office hit and already proved its lucrative nature as a franchise, with about 20% of the earnings coming from IMAX alone. Not to mention, this success was in 2021 when there were still hesitations about public movie-going as the COVID-19 pandemic was a larger concern. Without the setbacks of the pandemic, and an even longer IMAX run planned, Dune: Part Two could make even more money, indicating how the IMAX experience is not just for superheroes and Disney.

Dune 2 Is Taking Priority In IMAX Over Disney

Richard Gelfond put rumors to bed, confirming Dune: Part Two's IMAX release is on track. In the Q2 IMAX earnings call, transcribed by The Motley Fool, Gelfond explains why a postponement won't happen. "Dune [Part Two] is already in the midst of a marketing campaign. There are trailers out. There are lots of materials out," he said. Gelfond elaborates that a postponement would cost extra money since all that promoting would need to be redone whenever a new date was picked.

While Gelfond acknowledges that The Marvels is a great option to have on standby since it's Marvel, he solidifies that Dune: Part Two takes priority since moving its date would make its competition unclear, potentially damaging box-office sales. Additionally, with The Marvels having limited promotional content so far, mostly due to the SAG-AFTRA joining the WGA strike, the shift away from superhero movies as an automatic IMAX release may not be surprising. The most recent Disney and Marvel releases, while performing well at the box office, don't compare to the speculation that Dune: Part Two could beat Dune's box office success.

IMAX's Dune 2 Plan Could Show The Future For The Whole Industry

Marvel still holds a high standing when it comes to cinematic releases. For instance, The Marvels' predecessor Captain Marvel, released in 2019, grossed just over Dune's mid-pandemic income. This suggests that superhero films, while facing potential postponements or lower turnout, might have some work to do. As James Gunn, who has active involvement in both Marvel and DC, shares on the podcast Inside of You with Michael Rosenbaum, movies need to be about story, less about over-saturated capitalizing on the superhero genre, and should answer the question, "What makes this story different that it fills a need for people in theaters to go see?"

For the industry as a whole, IMAX's favoring of a sci-fi sequel over a Marvel movie indicates perhaps what Gunn is hoping for. By prioritizing films outside the superhero genre, it gives more space for a focus to be on story and inclusion rather than spectacle and action. It seems IMAX echoes this sentiment since Gelfond explains that the IMAX slate for 2023 includes 10 more local language films than originally estimated. Dune: Part Two, while still an English film, opens the door for more diversity in genre, at the very least, across IMAX screens, which could impact how future Hollywood movies get released.

submitted 1 year ago* (last edited 1 year ago) by Historical_General@lemm.ee to c/keepwriting@lemmy.world

I was inspired by S. Horwitz who transcribed Rowling's post of her outline of OoP and posted about it on his site/blog. And I quite liked this Grid.

He calls it an 'Expanded Series Grid', and was seemingly upset when Rowling described her chapter action summaries on the grid as 'plot'. He calls them 'series', which is not a term I'm familiar with, but his advice seems pretty good so I'll take his word for it.

I started in MS Word: open a document, go to Layout, set the margins to Narrow and the orientation to Landscape.

edit: I forgot to put in instructions for LibreOffice Writer: go to Format > Page Style > Page, click Landscape and halve the margins.

I took the image and ran it through an online service (extractimage) to get a usable table. It was mostly correct, so I copied the result and pasted it in.

The result is a pretty good method of compiling factional subplots, timing and the resulting chapter summaries, imo. If you like, I can update you on how my own personal efforts go.

| No. | Time | Title | Plot | Prophecy | Cho / Ginny | Order of the Phoenix | Dumbledore's Army |
|---|---|---|---|---|---|---|---|
| ~~13~~ 16 | Oct | ~~Plots and Resistance~~ In the Hog's Head | Harry has a lesson scheduled with Snape, but he skips it to go to Hogsmeade with Ron and Hermione... | Harry sees the Hall of Prophecy (in a dream). Voldemort is still formulating his plans;... | Cho is in Hogsmeade and wants to join Dumbledore's Army. | Tonks and Lupin | Recruiting |

A template to get your table started.
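If you'd rather generate the blank grid than build it by hand, here is a minimal, purely illustrative Python sketch. The column names are just the ones from the example grid above (swap in your own story's subplot threads); it writes a CSV you can open in a spreadsheet or paste into Word/Writer as a table:

```python
import csv
import io

# Columns from the example planning grid above; the subplot
# columns are placeholders -- replace them with your own threads.
COLUMNS = ["No.", "Time", "Title", "Plot", "Prophecy",
           "Cho / Ginny", "Order of the Phoenix", "Dumbledore's Army"]

def blank_grid(n_chapters: int) -> str:
    """Return a CSV template with one empty row per planned chapter."""
    buf = io.StringIO()
    writer = csv.writer(buf)
    writer.writerow(COLUMNS)
    for chapter in range(1, n_chapters + 1):
        # Chapter number filled in, every other cell left blank.
        writer.writerow([chapter] + [""] * (len(COLUMNS) - 1))
    return buf.getvalue()

if __name__ == "__main__":
    print(blank_grid(3))
```

Saving the output to a `.csv` file and inserting it via Word's or Writer's text-to-table conversion gets you to the same landscape grid with less fiddling.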

:)
