The College’s Randall-MacIver studentship was established in 1964 after the death of the widow of former student, Egyptologist and archaeologist David Randall-MacIver. We spoke to our current Randall-MacIver student Jennice Singh about her research into the archaeology of trees.

Can you tell us a bit about your research area, which examines trees in New Zealand using a method of dating using oxygen isotopes?

My research focuses on applying oxygen isotope dendrochronology to native New Zealand trees. These species are culturally and archaeologically significant, particularly Matai and Miro, which Māori communities used extensively for crafting canoes and constructing fortified settlements, known as Pā. Unfortunately, traditional methods like radiocarbon dating and ring-width dendrochronology often fail with these trees due to growth anomalies and calibration challenges in the Southern Hemisphere. By developing an oxygen isotope time series, my work aims to establish a robust framework for dating archaeological timbers, bridging a critical gap in understanding Māori and European heritage as preserved in timber archives. I am deeply grateful to the Randall-MacIver Scholarship and to Queen’s College for supporting this pioneering research, allowing me to contribute to both archaeological science and cultural understanding.

Why is this your method of choice?

Oxygen isotope dendrochronology has been highly effective in the UK, especially with oak and elm, where it has provided precise dates even when ring-width dendrochronology struggled. By focusing on δ¹⁸O variations influenced by summer precipitation, this method overcomes issues like growth irregularities and short samples. Its success with oak inspires confidence that it can address similar challenges with trees in New Zealand, offering a reliable framework for dating culturally significant timbers.
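
To make the cross-dating step concrete, here is a minimal Python sketch of the general idea: an undated annual δ¹⁸O series is slid along a dated master chronology and scored by correlation at each offset, with the best-scoring offset giving a candidate calendar date. The function name, the toy data, and the correlation-only scoring are illustrative assumptions; real dendrochronological dating also requires significance testing and replication before a date is accepted.

```python
import numpy as np

def crossdate(sample, master, master_start_year):
    """Slide an undated annual d18O series along a dated master chronology
    and score each offset by Pearson correlation; the best-scoring offset
    gives a candidate calendar year for the sample's first ring."""
    n = len(sample)
    best_r, best_year = -1.0, None
    for offset in range(len(master) - n + 1):
        r = np.corrcoef(sample, master[offset:offset + n])[0, 1]
        if r > best_r:
            best_r, best_year = r, master_start_year + offset
    return best_year, best_r

# Toy data: a 400-year master chronology and a noisy 50-ring excerpt of it
rng = np.random.default_rng(0)
master = rng.normal(size=400)
sample = master[210:260] + rng.normal(scale=0.3, size=50)
print(crossdate(sample, master, master_start_year=1600))  # -> (1810, high r)
```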

What questions about past climates are you hoping to answer with your work?

While the primary goal of my research is to refine timelines for archaeological applications, the isotopic data encoded in tree rings can indirectly offer insights into rainfall and temperature variability over the last 800 years, a period that covers the timeline of human settlement in New Zealand. These patterns can enhance our understanding of how environmental conditions influenced Māori interactions with their environment and broader historical events.

What inspired you to explore this area?

Growing up amidst the mountains of Nainital, India, surrounded by towering pine and ancient chestnut trees, I developed a deep fascination with trees and their histories. These early experiences, combined with listening to folklore, naturally evolved into a passion for studying tree rings. During my undergraduate studies in archaeology, I became interested in understanding how past societies interacted with climatic changes. However, it was during my MSc in Archaeological Science at Oxford that I became specifically drawn to isotope dendrochronology as a method for exploring both chronological and environmental dimensions of the past.

What are the key challenges that you face in your research?

Some challenges include difficulties in using ring-width dendrochronology on fast-growing species like Matai and Miro, which often exhibit growth anomalies and resin pooling. Additionally, obtaining samples from historic buildings and artefacts is challenging due to preservation issues. In New Zealand, the protected status of these species necessitates working collaboratively with locals to ensure cultural sensitivity and legal compliance during sampling.

What do you enjoy about being based at Queen’s?

Queen’s offers a wonderfully supportive academic environment and a vibrant community. The book grant, available to all Queen’s students, is a commendable initiative that has been particularly beneficial. Every interaction, from the friendly smiles at the Lodge to the chef’s creative desserts, adds a special charm to college life.

Can you recommend a book?

In terms of fiction, I feel there’s an underappreciated richness in Asian literature, where emotions are explored with a rawness that’s truly captivating. The Face of Another by Kōbō Abe, is a Japanese classic that delves into the topics of identity, human nature, and the psychology of isolation. Another favourite is Please Look After Mom by Korean author Kyung-sook Shin. This novel is an emotional excavation, tenderly and subtly peeling back layers of unexpressed love and regrets that lie beneath the surface of family life, revealing the haunting echoes of missed opportunities.

A long core that Jennice used during a demonstration with the Oxford Dendrochronology Unit at Weston Library. This tree was probably born around 1544 and died in 2020, offering a fascinating perspective on history, with significant events marked along its timeline.

We spoke to new Fellow in Physics Dr Kirsty Duffy about her research into neutrinos: the most important particles you’ve never heard of.

What first fascinated you about physics?

I first got interested in Physics when I was very young. I had a book called The Big Book of Incredible Facts, which taught me things like the fact that a cockroach can live for a year without a head, and that someone had once survived falling from an aeroplane with no parachute because they hit trees on the way down and landed in a snow drift. One of the other things I read in there was that atoms are made of smaller particles called protons, neutrons, and electrons, so when a teacher subsequently told us that everything was made out of atoms and that these were the smallest things, I knew that this wasn’t the case! Since then, I have been fascinated by how things work and fit together. To me, Physics is the underlying thing that explains everything else.

Your research in particle physics focuses on neutrinos.  Can you explain what kinds of questions you ask and why they are interesting and important?

Neutrinos are the most important particles you’ve never heard of. They are absolutely everywhere; they are the most abundant particle in the universe after photons (particles of light). Neutrinos are produced in nuclear reactions, so in the sun, in the centre of the earth, in nuclear reactors, in particle accelerators, and also in things like bananas, which contain a lot of potassium, which is radioactive. What is interesting about neutrinos is that despite their abundance, they almost never interact, and when they do, it’s only weakly (literally – via the weak force!). About a hundred million neutrinos will go through your thumbnail every second, and these come almost entirely from the sun. They actually go through your body all the time, but in your lifetime, on average, only one neutrino will actually hit an atom in your body and interact. This makes them very difficult to study because when they don’t interact, it’s impossible to see them. Consequently, they are one of the particles about which we know the least.

The biggest question that my work is trying to answer with neutrinos is why the universe exists. Scientists think that in the big bang equal amounts of matter and anti-matter were created. Physics is all about symmetry, so it makes sense that you would have the same amount of each, but the question is, if there were equal amounts in the beginning, why do we now have a universe which, as far as we can tell, is only made of matter? The answer has to be that there must be some difference in the physics of how matter and anti-matter behave such that all of the anti-matter disappeared. We can calculate that we need about a one in a million difference – that is, if you had about a million and one particles of matter and a million particles of anti-matter, the million pairs could annihilate and the one particle left over is enough to create the universe. What we need is some physics that will create a one in a million difference. We have already measured differences between matter and anti-matter in particles called quarks at the Large Hadron Collider at CERN, but the differences seen so far are very small and not enough to explain the amount of universe that exists. Our next guess, therefore, lies with neutrinos. The experiments that I’m doing are trying to see whether we can measure the difference between neutrinos and the anti-matter version, called anti-neutrinos.
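
In numbers, the one-in-a-million argument looks like this (a toy illustration using the counts quoted above, not a cosmological calculation):

```python
# Toy illustration of the one-in-a-million asymmetry argument.
matter, antimatter = 1_000_001, 1_000_000

annihilated_pairs = min(matter, antimatter)  # each matter/anti-matter pair annihilates
leftover = matter - antimatter               # what is left to build a universe from

print(leftover)            # 1
print(leftover / matter)   # ~1e-6: the required "one in a million" difference
```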

What is your favourite fact about neutrinos?

One really interesting thing about neutrinos is that they come in three types that we call flavours, and it was discovered fairly recently (in academic terms) that they can change from one flavour to another. This was very unexpected, and in fact the 2015 Nobel Prize was given to two of the people who discovered and proved this. Now the work that I am doing is to try to understand the mechanism of how they change. We create neutrinos in a controlled environment, in particle accelerators, where they are about 99% one type, and then we send them over a long distance and there is some probability that over that distance they will change into one of the other two types. Because neutrinos almost never interact, you don’t have to dig a tunnel to send them over this distance, you can just fire them straight into the ground. They will travel through the ground over hundreds of kilometres, during which time they will change, and then at the other end we measure them again and look for the new types. Specifically, if we can do this with neutrinos and then do it with anti-neutrinos, we can examine the differences in the way that they change to help us understand that fundamental difference between matter and anti-matter.
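
The probability of changing flavour over a given distance can be illustrated with the standard two-flavour oscillation formula, P = sin²(2θ)·sin²(1.27 Δm² L/E). The sketch below is a deliberately simplified toy: the parameter values are only indicative, and real long-baseline analyses use a full three-flavour treatment including matter effects.

```python
import numpy as np

def p_oscillation(L_km, E_GeV, sin2_2theta=0.085, dm2_eV2=2.5e-3):
    """Two-flavour appearance probability:
    P = sin^2(2*theta) * sin^2(1.27 * dm^2[eV^2] * L[km] / E[GeV]).
    Parameter values are indicative only."""
    return sin2_2theta * np.sin(1.27 * dm2_eV2 * L_km / E_GeV) ** 2

# A beam measured ~1300 km from its source (a DUNE-like baseline) at 2.5 GeV:
print(p_oscillation(L_km=1300, E_GeV=2.5))  # ~0.08 chance of having changed flavour
```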

What is your role in the MicroBooNE collaboration and what does this project investigate?

I am Physics Coordinator for MicroBooNE, which is the lead scientist position in a collaboration of around 180 people. I work with people from many different countries and institutions to further the same goal. My role involves setting the goals and managing all the different interests that people have to produce our collaboration-wide results.

MicroBooNE is the first in a programme of experiments in the US where we use a particular kind of detector called a liquid argon detector. We have a huge tank of liquid argon which is at minus 186 degrees Celsius, and we use this to try and see neutrinos.  What happens is that when a neutrino comes in, if we’re lucky, it interacts with an atom of argon and it produces charged particles.  As those charged particles travel through the detector, they ionise the argon leaving a little trail of electrons where they have been.  We put an electric field over the whole experiment so that the electrons drift to one side and we measure them with wires. The key thing about this technology is the incredible precision:  we can put the wires three millimetres apart and get a three-millimetre pixel size over a detector that’s about the size of a single-decker bus.
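
A toy sketch of the reconstruction idea, assuming a uniform drift field: the wire that fires gives one coordinate, and the electrons’ drift time gives the other. The drift speed used here is only an assumed order of magnitude, and real liquid argon detector reconstruction is far more sophisticated than this.

```python
# Toy mapping from a (wire, arrival-time) measurement to a 2D point,
# assuming a uniform drift field. Numbers are illustrative only; the
# 3 mm wire pitch is the figure quoted in the interview.
WIRE_PITCH_MM = 3.0          # wire spacing
DRIFT_SPEED_MM_PER_US = 1.6  # assumed drift speed in liquid argon

def hit_position(wire_index, drift_time_us):
    """y comes from which wire fired; x from how long the electrons drifted."""
    y = wire_index * WIRE_PITCH_MM
    x = drift_time_us * DRIFT_SPEED_MM_PER_US
    return x, y

print(hit_position(wire_index=120, drift_time_us=500))  # (800.0, 360.0) in mm
```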

This US-based programme, of which MicroBooNE is the first experiment, will eventually culminate in an experiment called DUNE, which will have four detectors, each the size of a Dreamliner plane. DUNE will be able to conduct large-scale observations of flavour change in both neutrinos and anti-neutrinos, so MicroBooNE’s role is partly to demonstrate the technology for this. What I am particularly interested in is seeing how neutrinos interact in the detector because no one has ever seen a neutrino directly. Even in the best particle detector we’ve ever built, we can only see the particles that have been produced or affected by the neutrino, not the neutrino itself. This means we have to try and backtrack from what we measure to understand the role that the neutrino must have played. It’s a difficult problem because we don’t fully understand the nuclear physics of how neutrinos interact with particles inside an atom, and there’s a lot of potential for misunderstanding.

What excites you about your work?

The thing that is most exciting is the fact that we are learning new things that no one knew before. There is the potential to discover something really big about how the universe evolved but even if we don’t, every measurement that we do is something that hasn’t been done before and is adding new knowledge to the world.

What are you looking forward to about being at Queen’s?

I’m really looking forward to getting to know people from lots of different subjects. I did my post-doctoral research at a particle physics lab in the US, which was great, but I really enjoy also having colleagues who work in the humanities and the other sciences.  The conversations that you have in a College setting just over lunch can often lead to new ideas.  You might talk to someone who is facing very similar problems to you but in a different field and realise that there are ways you can work together or learn from each other.

What do you enjoy about teaching?

Teaching often reminds me of why I like Physics. I teach things that first got me interested in Physics and I really enjoy talking to the students. At Queen’s the students are all fantastic: they have great ideas and often ask very insightful questions that make me think about my own work in ways I hadn’t thought about it before.

Can you tell us about your YouTube series Even Bananas and who it’s aimed at?

I have a YouTube series with Fermilab, which is the US particle physics lab and the home of the neutrino experiments I am working on. The series is called Even Bananas because neutrinos are produced by almost everything, even bananas. The idea behind the series is to give a fun introduction to neutrino physics. Our aim is to make it accessible to everyone, particularly to people with no science background. We feature guests who are experts in various aspects of neutrino physics, who come and tell us about their experiments and the specific problems that they are working to solve, as well as answering viewers’ questions. My favourite episode was an answer to someone who asked how big neutrinos are, and it involves my husband throwing beach balls at me from behind the camera.

Can you recommend a book? 

Neutrino by Frank Close. This gives a great overview of the history of neutrino physics and an excellent description of the experiments that went into discovering that neutrinos can change from one type to another.

Even Bananas Video Series

We spoke to new Fellow in English Professor Tamara Atkin about her research into the material conditions that shape literary production and reception. 

Your research examines the material conditions that shape literary production and reception.  Can you tell us a bit about what this means?

The really short answer to this question is that whilst there is nothing especially material about a text, its transmission is very often predicated on it being given physical form (there are some exceptions here, and even in predominantly literate cultures, there remain some genres and modes associated with orality). I’m interested in studying the technologies that enable material transmission—writing, printing—but also thinking about the ways in which the affordances of manuscript and printed textual production mediate the receptive possibilities of a given text.

I realise that might be a bit hard to grasp, so perhaps I can explain what I mean with a concrete example. In the epilogue to Thomas Dekker’s satirical play Satiromastix (1601, published 1602), Captain Pantilius Tucca encourages the audience to goad Ben Jonson into writing a rebuttal satire, declaring that, in so doing, Jonson ‘shall not loose his labour, he shall not turne his blanke verses into wast paper’. What’s interesting to me about this quotation is the way that Tucca suggests that when lost or wasted, intellectual labour—that is, the immaterial work of writing dramatic poetry—is transformed into physical waste paper—which in an early modern context typically meant printed or manuscript leaves recycled for another use.

In practice, what all this means is that whilst my research takes the material text as its object of study, I try to use codicological* and bibliographical practices and techniques as a way of thinking through quite broad questions about literary production, authorial labour, and textual reception.

What are your findings on premodern drama?

The first ever texts I read as an undergraduate were the late-medieval morality plays Mankind and Everyman, and I have remained fascinated with pre-Shakespearean drama ever since. Most critics of medieval and Tudor drama have written about the theatricality of these plays, and a lot of really great and important work has been done to reconstruct their original performance conditions. In contrast, I’ve always been interested in their textual history because, when you think about it, it’s not self-evident that plays should always be written down. Do dramatic texts represent a record of performance or are they designed to enable it? When did reading drama as an activity apart from or separate to performance become a ‘thing’?

These are the sorts of questions I set out to answer in my last book Reading Drama in Tudor England (Routledge, 2018). I wrote this book because I wanted to understand the print reception and status of drama before Shakespeare et al. began writing for the commercial stage. I learnt that early on printers developed conventions for articulating drama as a printed form – by this I mean that they established norms that determined the look of drama on the printed page, like the use of stage directions to indicate stage business and speech prefixes to organise dialogue. It has often been said of these features that they encode and thereby enable performance, but in writing Reading Drama I became increasingly convinced that printers developed and used features like character lists and stage directions not to enable performance but rather to signal the idea of performativity. These readily recognizable features, these conventions, act as a guide to tell us how to read the book and imagine it as a play.

In the context of Tudor drama this point is really important because a lot of these plays are routinely dismissed as sub-literary, crude precursors to the more literary drama of Shakespeare and his contemporaries. My work on Tudor drama suggests something different, namely, that as early as the 1550s and 1560s, drama was being printed for leisure-time consumption, as a genre of writing worthy of reading.

Your Leverhulme-funded work on the reuse and recycling of old books explores literary ideas about waste and reuse.  What conclusions have you drawn about the role of old books in early modern culture?

So many! Right now, I’m thinking about the ways that binding waste—which is to say, the dismembered bits of manuscript and printed texts recycled in the bindings of other, newer books—highlights the inherent instability and unfinishedness of the early modern book. All books are unfinished insofar as the making of meaning lies with the reader. But early modern books, which were typically sold stab-stitched but otherwise unbound, often with errata lists calling on the reader to correct errors, draw attention to their status as unfinished objects that required reworking.

Manuscript and printed waste represent examples of objects so heavily reworked that they simultaneously lose their materiality and are reduced to it. Binding fragments survive because they have been repurposed to secure the durability of other books. As fragments, however, they are also ghost-witnesses to texts that have become immaterial, incomplete, and unknowable. When these fragments coalesce with the leaves of the texts whose bindings they strengthen, they offer a stark reminder that textual value is contingent on readerly taste and judgement, and that irrespective of the author’s ambitions, all texts are subject to market forces that make them susceptible to dismemberment and reuse.

The College’s Centre for Manuscript and Text Cultures looks at pre-modern epigraphic traditions across cultures. What observations have you made about the interactions between manuscript and print?

Like other members of the College’s Centre for Manuscript and Text Cultures, I value the ways that working on pre-modern books creates unique opportunities for inter- and multi-disciplinary collaboration. For instance, I have recently been working on a collection of books in the Bodleian once owned by the twentieth-century collector and bibliophile Albert Ehrman. The collection has some incredible early sixteenth-century books in contemporary bindings, many of which contain uncatalogued manuscript and printed fragments. Identifying these fragments – some of which have proven to be very rare or otherwise unusual – has led to opportunities to collaborate with leading scholars in other fields, which has been an amazingly stimulating and often humbling experience.

Working on the printed and manuscript fragments that turn up in the bindings of other books has also challenged me to think about the relationships between different forms of textual technology. Again, I can probably best explain what I mean by way of example. In the Bodleian, there’s a copy of a 1572 edition of Plutarch’s Moralia in a binding that makes use of a fragment from an early fourteenth-century mass book as a spine support. It is clear from the heavy red ink that stains this fragment that it once served an intermediary function as a frisket sheet before it was repurposed as binding waste. For printing in red, printers used friskets, from which holes were cut out to allow selected areas of the inked metal type to be printed. In the crust of red ink on the missal fragment it is possible to make out the words ‘patri[s] ⁊ filii’ (Latin for ‘father and son’, as in the phrase ‘in the name of the father, son, and holy ghost’). These words make it clear that this frisket sheet was used for the printing of a ‘black letter’ liturgical text. Black letter is a print typeface based on a medieval handwritten script known as textualis quadrata. Though produced using a different technology, the printed words caught on the manuscript leaf therefore mimic the appearance of the handwritten missal, and in doing so, blur the distinction between printed and handwritten text.

Missal fragment recycled as a spine support; the crust of red ink indicates that this fragment served an intermediary function as a frisket sheet for the printing of red ink. Oxford, Bodleian Library, Lincoln CHS 36. Reproduced with kind permission under Creative Commons licence CC BY-NC 4.0.

Is there an item in the College’s book collection that you’re particularly keen to see (and, if so, why)?

It’s very hard to pick one! I’m excited to go and spend some time with the card index, as I am interested to know more about the kinds of readers who have interacted with the College’s historical collection and the sorts of ways they recorded their engagement. To give an example: I have for many years been interested in the writings and other activities of the notorious Protestant polemicist John Bale (d. 1563). The College holds several books associated with him, including a copy of an English translation, very likely by Bale, of a Latin tract in defence of the Royal Supremacy. The College’s copy remains in its original sixteenth-century blind-tooled binding, and the card catalogue enticingly notes that the margins and endleaves are full of sixteenth- and seventeenth-century manuscript notes and additions. Who was or were the reader or readers responsible for interacting with this book in this way? What can we learn about the status and value of books as objects from these manuscript additions? And how do such marks of readerly engagement nuance our understanding of religious controversy in the sixteenth century?

Alongside evidence of ownership and reading, I’m excited to think about the ways the College’s historical holdings can enliven my current research into both manuscript and printed waste, and the early modern second-hand book trade. The catalogue entry for the book I’ve just mentioned describes a calf binding over wooden boards with remnants of metal clasps. This style of binding is typical of bindings produced in the first half of the sixteenth century, and it is very common to find wasted manuscript fragments used as pastedowns on the insides of the boards. By surveying books in historical bindings, I’m excited to discover new manuscript and printed fragments that have thus far escaped cataloguing!

In my work on the second-hand trade, I’ve been making an inventory of booksellers’ notes, since these can offer a glimpse into little known or understood trade practices. For instance, in the early modern period, when books were sold second-hand, if they were especially old, big, or valuable, it was not uncommon for a bookseller to add a ‘warranted perfect’ note, guaranteeing the completeness of the copy for sale. I’m looking forward to spending time with the College’s holdings in sixteenth- and seventeenth-century bindings, as notes like these were typically added to pastedowns or flyleaves. I’m keen to learn more about the lives of these books before they came to Queen’s.

What are you looking forward to about being at Queen’s?

As you can probably tell from my answer to the previous question, the Library has a huge draw, and I am excited to work with and alongside Librarian Dr Matthew Shaw and the rest of the Library team as I get to know the collection better. I’m especially keen to encourage undergraduates to work in and with special collections with confidence, and I can see various opportunities for bringing Queen’s students into closer contact with the College’s amazing collections.

I’ve only been here since September, but it’s clear that Queen’s has an incredibly welcoming and rich community of academics, staff, and students. I’m really looking forward to getting to know colleagues and students better and building on the conversations I’ve already had over lunch and dinner to work collaboratively and across different disciplines.

Do you use the special collections in your teaching?

Yes. Last week I took my second-years to the Weston Library and we looked at a selection of early modern manuscripts, including Bodleian MS Tanner 307, a scribal manuscript containing 167 poems by George Herbert, which were subsequently published as The Temple (1633). This manuscript may have been prepared to obtain a licence for that edition, and it was fantastic for students to have the rare opportunity to compare the manuscript and print versions. Tomorrow, we will visit the College Library to examine items selected by students that complement their work for the early modern period paper. It can be quite intimidating to work with special collections, particularly for undergraduates, and I’m so grateful to Dr Shaw for enabling sessions like this, which can really transform the way students think about and work with literary texts.

What are you working on at the moment?

I’m currently finishing a monograph, Reusing Books in Early Modern England, that considers the long lifecycle of manuscripts and books after their initial production and reception. Work for this project has been supported by a Leverhulme Major Research Fellowship, which has allowed me to spend a lot of time digging around in libraries and archives – a huge luxury! Once finished, the book will bring together several of my longstanding interests: the cultural and intellectual habits formed by the Reformation; early modern book history; and the interplay between material and metaphorical language. It’s been enormously fun and rewarding to research, and I am now enjoying the challenge of turning that research into a piece of long-form academic writing.

My next project is about the early modern second-hand book trade, about which surprisingly little has been written. I’m currently putting together an application to secure funding for a team to undertake this research collaboratively; it’s a big and ambitious project and will benefit from scholarly expertise across a range of different areas. I want to know who bought and sold second-hand books, where they came from, how they were valued, and what role they played in the making and unmaking of both private and public collections. In answering these sorts of questions, I think we can begin to challenge conventional wisdom about the English early modern book trade, which has mostly focused on new books produced in London.

Can you recommend a book? 

This is such a hard question. I’m going to do my best not to squirm out of it, so will recommend The Safekeep by Yael van der Wouden. Why this book? For me reading is so often a professional activity, something I do at a desk, in a study, or in the library. I read this book on holiday, largely whilst lying in a hammock next to a swimming pool where my children were playing. I’m sure part of the pleasure I took in reading it was in the heat of the air, the nearby sounds of my children, the whole aural and sensorial experience of being at rest. And as I lay, prone, reading, I enjoyed the way that van der Wouden was able to manipulate my response to the main character.

I started out with little sympathy for Isabel, a young woman living in a rural area of the Netherlands in the aftermath of the Second World War. She seemed pettily parsimonious, obsessive, and controlling. But over the course of the novel, as she struggles to come to terms with her mother’s death, with her insecure hold on the house she calls home, and with her feelings for her brother’s girlfriend Eva, I found myself liking her more and more, becoming increasingly invested in her material and emotional fate. Add to that the wider context of life in postwar Holland as the country struggles to make sense of the years of Nazi occupation, and I found it a compelling and thrilling read.

The Safekeep was one of the first books I read on my new Kindle. As I’ve already indicated, I’m really interested in thinking about the ways different textual technologies mediate the reading experience – how reading a printed book differs from reading a manuscript book – and I think these sorts of questions are equally pertinent to newer forms of text. Reading novels on my Kindle is great for travelling, but I miss the tactility of holding a book, of flicking back to cross-check a reference, and I love that my physical books hold traces – in dog-eared corners, forgotten bookmarks, the occasional underlined word, etc. – that record my experience of reading them. I’m not getting rid of my Kindle just yet, but if anything it’s reminded me how great – how irreplaceable – books are!

*Pertaining to the study of the book; taken from the Latin word codex meaning book, codicology refers to the study of the whole manuscript book, all its physical and historical characteristics

In Michaelmas Term 2024 Professor Christina Davis joins us from Harvard as the College’s PPE Centenary Visiting Professor. Ahead of her PPE Centenary lecture on 13 November, we asked her about her research in international relations and why the effects of complex trade-offs between nations are felt across industries.

What first interested you in geopolitics and international relations?

I have been studying international relations for many years and looking at how international law can both constrain states and help them cooperate. I wanted to think more about the conditions that make states willing to accept constraints and reduce their freedom by joining an international organisation that has rules of conduct. In my most recent book I was looking at the politics of joining international organisations.  I have long been a scholar of international trade, and this new research was a chance for me to examine how the World Trade Organisation was able to expand to include almost all countries when it had started as just a very small group.  I am interested in how the decision is made by a country to try to join an organisation like this, and how others decide if they are willing to let new countries join.

Your recent book Discriminatory Clubs: The Geopolitics of International Organizations reveals what you refer to as ‘the discriminatory logic at the heart of multilateral institutions’. Can you explain what you mean by this?

Sometimes international organisations pretend to be rule-based but allow great discretion over who can join. It’s the nature of geopolitical ‘clubs’ like the World Trade Organisation that they can decide both whether they want free trade and with whom they want free trade. I study how geopolitics can raise tensions with the ideal that international law serves an objective principle. One part of ‘joining the club’ is access to specific benefits, and one part is the social status of closer association with a particular group.  It’s the value of association that encourages discrimination, which is something we see when we look at a golf club or a social club, for example. Too often we study international organisations and think of them in terms of an abstract contract. I argue that it’s more than a contract: states are making decisions to cooperate based on more than who’s the best trading state or who has the best law for environmental protection.

States introduce other criteria that are not in the law, for example about their allies and cultural factors.  What I find overall with a lot of international politics is that the discrimination quotient is on security: states favour their friends and security interests.  Even if it’s an organisation about economics, you still find a large security component that favours cooperation among allies.  When I use the social club analogy, I say that international organisations are more like the golf club than the football club: there’s no try-out to see who’s the best player, the most qualified to participate, it’s much more about who you would like to work with.  A long-term relationship of mutual interest and valued association makes states want to join some clubs over others.

Does your research show how we might support cooperation within these organisations given the nature of the framework that you have uncovered?

There’s a role for a small group of like-minded states that share common interests and security to make compromises necessary for hard cooperation.  In this case discrimination that favours a small group could be advantageous to moving forward.  However, we should consider the costs of excluding a state. Are we closing off cooperation that might otherwise help to achieve a more stable trading system or wider action to protect our environment? Such concerns might lead one to expand the club beyond the small group of like-minded states, which is what happened in the World Trade Organisation. It’s important when we’re looking at why organisations are more or less successful at cooperation to think about the trade-offs. Some organisations are less effective because states don’t trust each other and don’t share enough commonality. Part of the process of trade-offs is that if you expand to be a very diverse group, it’s harder to achieve a single goal.

Can you tell us about your research into the effects of peer conformity on economic sanctions?

There is a recognition in international relations that there are spheres of influence and clubs of states that act together on a whole range of issues. One way to demonstrate your unity with your group of states is to form an alliance and another is to join the same international organisation and take similar positions on the critical issues in the world.  We see this when G7 nations make a common statement in support of sanctions against Russia or on another issue, and there are many countries that get caught in the middle because they don’t want to take sides.  We need to recognise that it’s sometimes difficult to join an international organisation when the decision is not just perceived to be about joining the common market but about appearing to take sides.  That is why Ukraine wanting to join the EU was a threat to Russia. I have also been analysing UN voting and looking at the informal politics that can inform a group of countries and their approach to each issue brought before the General Assembly of the UN. There are patterns of association; it’s not just a country acting on its own interests but following a group of states who have similar policies.

While my book engages with how states can show club behaviour, I also work on the effect of peer pressure on firms who can get caught in the middle. You might think that firms are just looking at the bottom line and cost/benefit maximisation and yet increasingly, even firms are being pressured into taking sides on geopolitical issues. 

This can be difficult because their managers are trained to study profit margins, not geopolitics. Deciding whether to continue trading in the context of war if you have a subsidiary in the country is a statement about your perception of the conflict. For example, government sanctions didn’t require withdrawal from Russia, but we’re finding that some businesses are going beyond what’s required.

That’s where it’s interesting for me to study the peer pressure that emerges. A company hears that x and y have withdrawn and might think that it should also withdraw. One of my favourite examples was a survey experiment I conducted that looked at Japanese business decisions toward sanctions against Russia. In this research, my co-authors and I discovered a balancing act of influence — hearing of European and American firms’ withdrawal made managers more likely to think Japanese firms should also withdraw but, at the same time, seeing that their Chinese competitors were continuing to do business had an offsetting influence which made some Japanese managers cautious about whether to reduce their economic transactions with Russia. I am interested in the strategic interactions that go beyond an anonymous firm looking at the market.

Have you seen a growing influence of the role of geopolitics in market decisions?  And, if yes, why do you think that might be the case?

Yes, I think that it has become increasingly important to think about geopolitics to understand the global economy. If we were to go back 20 years, it might not matter as much to think about nationality when making a trading decision. Now, everything from concern about possible future sanctions harming the value of a transaction, to the views of your shareholders and consumers affects decisions. I think we’re seeing more and more firms taking these questions into consideration and there have been studies looking at how joining (and exiting) international organisations can shape your risk evaluation.  I also see more students who are studying sanctions and politics who go on to work in consulting firms. These firms are then increasing their geopolitical risk assessment, and law firms are expanding their compliance offices.  This is also true for banking and consumer goods. Trade is no longer set by economic principles alone.

As a graduate student you spent time overseas; how did this affect your approach and methods as a researcher?

It is always good to see the country you are studying and often a chance experience there can help you rethink a problem.  I studied agricultural trade and why Japan protects their rice market and was interested in when governments should use protection and pay subsidies within an industry to ensure economic security. I was in Japan as an exchange student in the summer of 1993 when they had a bad harvest and a shortage of rice. I queued up with everyone at the grocery store to get the last bags of Japanese rice and nobody wanted to buy the imported Thai rice. Fortunately, I was happy to buy the imported rice. This experience showed me that you can achieve food security through many paths: if you think you have to produce everything at home then there’s the risk that when something like a bad harvest happens, you won’t have enough to eat. 

I talk to my students about how our academic questions are much more interesting when you experience them in person. For me, standing in a line to buy rice highlighted that trying to achieve self-sufficiency wasn’t necessarily the safest approach. At the same time, I also met farmers and learned about the value of their role in the community and as providers of food. International exchange is all about understanding different people’s perspectives.  I gained a lot from talking to politicians and farmers in Japan because it gave me more insight into how they viewed a problem. It’s important to look at an issue beyond the academic question.   

What are you looking forward to most about your time in Oxford?

It’s great to be given this opportunity to meet a new intellectual community. Oxford has incredible scholars in international relations, but I am also already meeting many scholars in other fields and there’s so much to be learned from subjects outside your own field. I am particularly looking forward to this intellectual exchange.  I have started a new project on trade diplomacy and it’s great to have the time in Oxford, and the connection to the PPE programme, to take an interdisciplinary approach when thinking about my research questions.  I hope that I can bring an interdisciplinary and historic approach to the current interest in advancing economic security in a global economy.

Tell us about the forthcoming PPE Centenary lecture and who should attend.

I would like to discuss the complex trade-offs brought about when we have rivalry between nations at the same time as deep interdependence. We need to think about how we make choices. What are the types of goods where a country is comfortable depending on others for supply? At what price do we want economic security, whether by raising protection or spending taxpayer money to subsidise domestic production? This is an old problem. We can go back in time and think about why countries have supported their shipping or steel industries, and then look ahead to why today we are making choices about semiconductors or whether it’s safe for our kids to use TikTok. It’s a fundamental question of how the nation relates to the international economy. It requires us to think about ethics, the role of government, and trade-offs between competing interests for economic efficiency, global supply chains, and caution about depending on others. Some would argue we should only rely on countries we know and trust, but then we must decide what counts as trust among nations.

I hope the lecture will interest people who haven’t thought much about economic problems so they can recognise that the costs of sanctions are being borne by many of us and that the risks of interdependence are important across spheres.  For example, if you’re working at an international firm, you might be trying to comply with sanctions policies; or you might be developing the latest technology and need to consider who is the end user. If you are a researcher in the lab or engineer designing new robotic technology, how widely will you share your scientific ideas?  All of us have to make these trade-offs about whether to openly engage with the international society and economy.  So, I would be thrilled if people attend who do not have a lot of prior experience with economics.

Can you recommend a book? 

Global Discord: Values and Power in a Fractured World Order by Paul Tucker. This is a fascinating book about international institutions and how to think about why there’s been so much backlash today about cooperating in a rules-based order. Paul Tucker was a very senior British banker (the Deputy Governor of the Bank of England) who left banking to enter academia. He advocates for a more thoughtful approach to the ethics and politics of international finance.  The reason I like the book is because it’s one of these incredible cases of a very senior policymaker engaging in academic debates and I really value it when practitioners take the time to engage with academic work.  I think any PPE student should read this book.

We spoke to Dr Josu Aurrekoetxea as he comes to the end of his time as Beecroft Extraordinary Junior Research Fellow in Astrophysics at Queen’s.

Your research asks some big and fundamental questions, such as ‘how did the universe begin?’; how do you break these big questions down into smaller ones in order to try and find answers?

We have a very powerful framework for studying such questions: Einstein’s theory of General Relativity. It has been remarkably successful in explaining observations across scales—from the dynamics of the planets in our solar system to the evolution and formation of galaxies in the universe. This theory provides the theoretical basis for asking fundamental questions about the origin and fate of the universe.

To break these questions down, we often start with the simplest possible scenarios—what we humorously refer to as “spherical cows”. Once we understand what the mathematics tells us about this simplified version of the problem, we gradually add layers of complexity, always guided by the observational evidence we have.

How do you use computing in your research?

The traditional picture of theoretical physicists working on blackboards, often shown in movies, feels somewhat outdated today. While blackboards are still great for brainstorming and working through ideas, most modern physics research relies heavily on computing for handling complex calculations and analysing data from observatories and experiments. Computers also allow us to create visualisations of simulations, which not only help us better understand what is happening but are also great for presenting in talks. Coding has therefore become an essential skill for physicists.

For example, the equations of General Relativity, which describe phenomena like black holes or the universe itself, can only be solved “with pen and paper” in very specific, simplified cases—often under idealised conditions (those “spherical cows” I mentioned!). To explore more realistic and complex scenarios, we need supercomputers. These machines are fundamentally just adding up zeros and ones, but they do it much faster than we ever could!

Your novel research programme explores the collision of black holes; what have you learned about these events?

We know that black holes inspiral and eventually merge, forming larger black holes. During this process, they emit gravitational waves—ripples in the fabric of spacetime—that cause space to stretch and squeeze in a specific pattern. This pattern, known as the ‘waveform,’ carries valuable information about the colliding black holes, such as their masses and spins. The first direct detection of gravitational waves was made in 2015 using highly sensitive instruments called interferometers. Since then, the LIGO, Virgo, and KAGRA gravitational-wave observatories have detected over a hundred such events, opening a new window into astrophysics where gravitational waves, unlike electromagnetic waves, allow us to observe objects that are otherwise invisible to traditional telescopes.

We also know that roughly 25% of the universe is composed of dark matter, an elusive form of matter that we haven’t directly detected yet. Some black hole mergers are expected to be surrounded by dark matter, leaving subtle imprints on the gravitational waveforms during collisions. In our research, we use supercomputers to calculate what these signatures might look like. What we have discovered is that dark matter can speed up the merger process of black holes and introduce “dephasings” in the expected signal—small shifts in the waveform patterns. Detecting these effects would provide key insights into the nature of dark matter and the environments where black holes reside. We are currently using our newly developed signals to analyse the data obtained by the gravitational-wave observatories in hopes of identifying these dark matter signatures.
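
As a loose illustration of what a ‘dephasing’ means for a signal (a toy sketch, not one of the relativistic supercomputer simulations described above), the snippet below compares a simple chirp-like waveform with a copy whose phase slowly drifts, and computes how well the two match:

```python
import numpy as np

t = np.linspace(0.0, 1.0, 4000)                    # time, arbitrary units
phase_vacuum = 2 * np.pi * (20 * t + 40 * t**2)    # toy chirp: frequency rises toward merger
phase_dm = phase_vacuum + 2 * np.pi * 0.5 * t**3   # small, slowly accumulating dephasing

h_vacuum = np.sin(phase_vacuum)
h_dm = np.sin(phase_dm)

# Normalised (noise-free) match: 1.0 for identical signals, lower as the
# accumulated phase shift pulls the two waveforms apart.
match = h_vacuum @ h_dm / np.sqrt((h_vacuum @ h_vacuum) * (h_dm @ h_dm))
print(f"waveform match: {match:.3f}")
```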

What fascinates you most about your subject?

Our ability to use mathematical methods to predict and understand natural phenomena, and then build experiments that can test these hypotheses. Patience is key, as these predictions can take a long time to be verified. For example, it took 100 years from the prediction of gravitational waves to their direct detection. We had to develop instruments sensitive enough to measure the stretch and squeeze of space on the order of one thousandth of the diameter of a proton! A century later, we can now measure such tiny displacements and confidently say they were caused by two black holes, each 30 times the mass of the Sun, colliding billions of light-years away from us.

What have you enjoyed about researching and teaching at Oxford?

As Beecroft Fellow and eJRF at Queen’s, I have been fortunate to have complete freedom in my research. This has allowed me to explore new directions and broaden my expertise. The vibrant atmosphere, with many postdocs, visiting researchers, and seminars, makes it an exciting place to work. Teaching at Queen’s has also been incredibly rewarding—getting to know the students, witnessing their progress, and seeing the next steps they take in their careers.

What is your favourite thing about Queen’s?

I really enjoy the many opportunities the College provides to meet students, Fellows, and Old Members. The staff are fantastic; they have always been incredibly helpful. My favourite place in College is the Upper Library, where I have spent many hours seeking inspiration for writing articles and research proposals.

Can you recommend a book? 

The book that first caught my attention in theoretical physics was the classic A Brief History of Time by Stephen Hawking. I remember being fascinated when I read it in high school. Recently, I read Comets, Cosmology and the Big Bang, a historical book that discusses the contributions of Edmond Halley and Edwin Hubble to astronomy—both of whom were Old Members of Queen’s! Up next on my reading list is The Perfect Theory by Oxford cosmologist Pedro Ferreira, who is both a great scientist and communicator.

I also enjoy watching scientific content on YouTube, where there is plenty of material available. Channels like MinutePhysics, Veritasium, and DrBecky (among many others) make complex topics accessible to a general audience and are fantastic resources for anyone interested in diving into science.

What’s next for you?

I am super excited to be joining the Center for Theoretical Physics at MIT as the inaugural CTP postdoctoral Fellow this October. Many groundbreaking advances in early universe cosmology and gravitational waves have been developed there, and I am looking forward to contributing my expertise to the department’s research, while also expanding my network of collaborators in the US. That said, I am leaving a few textbooks behind in Oxford that won’t fit in my suitcase, so it won’t be long before I am back at Queen’s for a visit!

This blog is based on an article published in the journal Science: “Drivers of epidemic dynamics in real time from daily contact tracing COVID-19 measurements”.

Dr Jasmina Panovska-Griffiths (Lecturer in Probability and Statistics)

In my recently co-authored study published in Science, we used digital contact-tracing data alongside mathematical modelling to provide insights into the COVID-19 epidemic dynamics in England and Wales. The study was led by researchers from the Pandemic Sciences Institute at the University of Oxford alongside researchers from Warwick University, and included contributions from data scientists at UKHSA. The core of the work comprised analysis of digital contact-tracing data from the NHS COVID-19 App for England and Wales, which detects proximity between phones to alert people at risk of infection.

The contact-tracing data allow contacts and transmissions to be separated by day of contact and by setting, facilitating exploration of whether contacts and transmission during the week and at the weekend differed. Our results showed high variability here; for example, transmissions from very short exposures of less than 30 minutes were more frequent at the weekend than during the week, and notably more than twice as common on Saturdays as on Mondays. Furthermore, exploring this pattern over different time periods, and specifically over periods of high social-mixing, also suggested an association between high social mixing and increased COVID-19 transmission.
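
As an illustration of this kind of breakdown, here is a minimal pandas sketch over a hypothetical event table (the real app data schema and analysis pipeline are, of course, different):

```python
import pandas as pd

# Hypothetical schema: one row per app-detected transmission event,
# with a timestamp and the exposure duration in minutes.
events = pd.DataFrame({
    "timestamp": pd.to_datetime(["2021-07-10 21:00", "2021-07-11 20:30",
                                 "2021-07-12 09:00", "2021-07-17 22:00"]),
    "exposure_minutes": [25, 12, 45, 18],
})

short = events[events["exposure_minutes"] < 30].copy()
short["weekday"] = short["timestamp"].dt.day_name()
print(short.groupby("weekday").size())  # compare e.g. Saturday vs Monday counts
```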

For example, during the Euro football tournament in 2021, on the days that the English football team played matches there were sharp spikes in both the number of exposures and the transmission probability (Figure below). Notably, on 11 July 2021, when England played Italy in the final, transmissions were between six and nine times higher than would otherwise have been expected. Since the matches in 2021 were played in football stadiums across Europe and not in one specific place, the peaks in transmission on match days can be attributed mostly to social gatherings in homes and pubs, rather than to supporters at the stadium. This analysis is rather topical as, at the time of writing, the Euro 2024 football tournament is under way, and we should be grateful that there isn’t a highly transmissible COVID-19 variant in circulation as we gather with friends to watch the matches.

In addition to untangling the association between social mixing and transmission, in the paper we also used the contact-tracing data to evaluate the temporal distribution of the effective reproduction number R (the average number of secondary infections that emerge from one infection) over the study period. During the COVID-19 epidemic in the UK, R was generated weekly and fortnightly by the UKHSA Epi-Ensemble modelling group under my leadership and in collaboration with academic partners across different UK universities. In the paper, we compared the changes in R derived from the ensemble of mathematical models generated by us, versus the R from the contact-tracing data. The results illustrated that using digital contact tracing to estimate the reproduction number R can be an alternative approach which allows epidemic changes to be seen at least five days earlier than via our ensemble modelling. Hence digital contact tracing can be a useful alternative early-warning system of epidemic change.
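
For readers curious what an incidence-based estimate of R can look like in its simplest textbook form, here is a crude renewal-equation sketch (this is a deliberate simplification, not the UKHSA ensemble pipeline or the paper’s contact-tracing method):

```python
import numpy as np

def simple_rt(incidence, gen_interval):
    """Crude renewal-equation estimate: R_t = I_t / sum_s w_s * I_{t-s},
    where gen_interval gives the generation-interval weights w_1..w_k
    (w_1 weighting the most recent day). A textbook simplification."""
    w = np.asarray(gen_interval, dtype=float)
    w = w / w.sum()
    rt = []
    for t in range(len(w), len(incidence)):
        denom = np.dot(w, incidence[t - len(w):t][::-1])
        rt.append(incidence[t] / denom if denom > 0 else np.nan)
    return np.array(rt)

cases = np.array([100, 110, 125, 140, 160, 180, 205, 230])
print(simple_rt(cases, gen_interval=[0.2, 0.5, 0.3]))  # values > 1 indicate growth
```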

Overall, this interesting study illustrates that digital contact tracing can not only help control epidemics by identifying and directly preventing transmission, but is also a useful innovative approach for epidemic nowcasting.

Figure shows the correlation between football matches and the daily number of contacts and transmissions detected by the COVID-19 app. There are clear spikes in numbers on dates when there was a football match.

Third-year Queen’s students Jiahe Qiu (JQ; Mathematics and Statistics) and Daniel Bell (DB; Mathematics) are in discussion with Dr Jasmina Panovska-Griffiths (JPG; Lecturer in Probability and Statistics) about a themed special issue, ‘Technical challenges of modelling real-life epidemics and examples of overcoming these’, published recently in Philosophical Transactions of The Royal Society A.

(JQ to JPG) You were recently a lead editor on a special issue of Phil Trans Roy Soc A alongside two of your colleagues. Tell us a little bit more about the special issue.

JPG: This themed issue is a collection of 16 invited papers that highlight the scientifically diverse contributions that the Royal Society’s RAMP initiative has made over the last two pandemic years. The papers come from a diverse range of academics, most of whom contributed to the RAMP initiative, who have been modelling the transmission of COVID-19 using methods borrowed not only from classical epidemiological modelling but also from statistics, theoretical physics, molecular biology, and algebraic theory, among others. The special issue provided a platform to showcase different approaches and to discuss technical issues identified during the COVID-19 epidemic that are relevant when modelling future ones.

(DB to JPG) Is the RAMP initiative something new that emerged over the pandemic?

JPG: In February 2020, the UK already had a number of strong, established epidemiology groups and an independent national advisory body, SPI-M, with extensive expertise in modelling infectious-disease spread. The RAMP initiative offered an additional set of diverse modelling expertise and was a novel, experimental way of organising science during an emergency. The whole initiative was designed and led by scientists, operated on a volunteer basis without a budget, and supplemented the SPI-M work.

(JPG to JQ and DB) As students of mathematics, which aspects of the mathematical applications in the special issue did you like most and least?

DB: I really enjoyed reading about the data analysis and inference methods that went into informing decisions. In particular, over the last two years, we have all heard so much about the effective reproduction number ‘R’ in the news, and I found it really interesting reading about how this is actually calculated in the paper by Ackland et al.

On the other hand, although the application of many of the papers is apparent, I would have liked more explanation of whether, and how, the methods outlined were actually used during the pandemic. For example, keeping with the R theme, it would have been useful to know which other models were used to generate R and how they compare.

JPG: That is a very good point. In fact, Ackland et al is one of the models that has been used to generate R throughout the pandemic, as were a number of other models in the special issue, including my Covasim model for England. There is complex mathematics (specifically, statistics) behind combining the R outcomes from a number of models, and we are currently working on a scientific paper that outlines how this method was applied by SPI-M, and more recently by UKHSA, to generate the combined weekly R value published on the UK Government Dashboard.
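As a purely illustrative sketch of what ‘combining’ can mean here, one simple option is an inverse-variance weighted average of the per-model estimates. The actual SPI-M/UKHSA combination method, to be described in the forthcoming paper, is more sophisticated; the numbers below are invented.

```python
# Deliberately simple sketch of combining R estimates from several
# models via inverse-variance weighting; illustrates the general idea
# of a statistical ensemble, not the actual SPI-M/UKHSA method.
import numpy as np

# Hypothetical per-model estimates: central R and standard error.
R_hat = np.array([0.90, 1.10, 1.05, 0.95])
se    = np.array([0.10, 0.15, 0.08, 0.12])

w = 1.0 / se**2                         # weight precise models more heavily
R_combined = np.sum(w * R_hat) / np.sum(w)
se_combined = np.sqrt(1.0 / np.sum(w))
print(f"combined R = {R_combined:.2f} +/- {1.96 * se_combined:.2f}")
```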

JQ: I was most fascinated by the article on the Covasim agent-based model, where you were a lead author. It gives a simple yet insightful example of how geographical information can be integrated into statistical models: by summarising interpersonal dynamics in society into school, work, household, and community layers, and assigning individuals by age group. I would say I enjoyed the articles – reasonable, refreshing, and very friendly to rookie statisticians like me. I also downloaded and played with the PTRSA and Covasim repositories mentioned in the article. Is this work ongoing, and what are the key questions you are looking at now?

JPG: I am glad you have enjoyed the paper on the application of the Covasim model and that you have tried running the simulations using the named GitHub repositories. This work is ongoing, and I continue to track the COVID-19 epidemic with this model. In addition, I am using the calibrated Covasim model, combined with statistical analysis of the genomic data, to explore the progressive transmissibility of the different Omicron variants over late 2021 and during 2022, and to evaluate the impact of the booster vaccinations in the autumn of 2022 and look at possible future epidemic trajectories.  

(JQ to JPG) Maybe I missed something, but there was one thing in this article led by you that I did not fully follow: how were the assumptions about transmission probabilities and the effectiveness of social-distancing rules derived? For example, “When schools reopened from 8 March 2021, we assumed a reduction in the per-contact transmission probabilities by 37% in schools” (from rsta.2021.0315 P8). Were these biological and sociological assumptions, or were they based on statistical results?

JPG: Your question on how we fixed the transmission probabilities across the different layers (household, school, workplace, and community) is a very good one. Deriving this number is tricky, as it comprises a number of biological (e.g. intrinsic viral transmissibility) and sociological (e.g. the strength of contact-mixing patterns in society) parameters that we don’t necessarily know. For the purposes of this study, the value we used was derived from a combination of data and statistical analysis. Specifically, we used the weekly Google mobility data on the level of social mixing, together with a reduction reflecting the effect of interventions such as mask-wearing or social distancing within schools (from the existing literature), to generate a tight range of values for the transmission probability within each layer of society. We then sampled from this range during the calibration process to generate an average value that we used in the agent-based simulations.
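A minimal sketch of the sampling step described here might look as follows; the layer names echo the paper, but the numerical ranges are invented for illustration and the real calibration scores each draw against observed data.

```python
# Sketch of sampling per-layer transmission probabilities from tight
# ranges informed by mobility data and the intervention literature.
# Ranges are illustrative, not the paper's actual values.
import numpy as np

rng = np.random.default_rng(42)
layer_ranges = {            # (low, high) per-contact transmission probability
    "household": (0.08, 0.12),
    "school":    (0.03, 0.05),
    "workplace": (0.03, 0.06),
    "community": (0.01, 0.03),
}

def draw_betas():
    """Draw one candidate set of per-layer probabilities."""
    return {layer: rng.uniform(lo, hi) for layer, (lo, hi) in layer_ranges.items()}

# In the full calibration these draws would be scored against observed
# cases and deaths and only well-fitting draws kept; here we just average.
samples = [draw_betas() for _ in range(1000)]
means = {layer: np.mean([s[layer] for s in samples]) for layer in layer_ranges}
print(means)
```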

(DB to JPG) The compositional modelling framework set out in the article by Libkind et al seems intuitive and appears to have many advantages over traditional modelling. Is such a framework currently widely used and, if not, can you see it gaining popularity? Does such a set-up have any drawbacks that weren’t discussed in the paper?

JPG: The framework that Libkind et al use in their model is novel and hasn’t been used before for epidemiological modelling. I think it is very interesting that their framework suggests you can treat ordinary differential equations, delay differential equations, and Petri nets with mass-action kinetics as different semantics for the same syntax, formalised using concepts from category theory. Some aspects of this are also used in the compositional modelling in the paper by Waites et al. While there are no obvious drawbacks to this method, further testing of the framework – against different diseases, incorporating different interventions and datasets – may increase confidence in it as an additional tool for future pandemic-preparedness modelling.
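To give a flavour of what ‘Petri nets with mass-action kinetics’ means in practice, here is a toy sketch in which an SIR model is written as transitions with stoichiometry and the ODE right-hand side is derived generically from them. This illustrates only the mass-action semantics, not the category-theoretic framework of Libkind et al.

```python
# Toy "Petri net with mass-action semantics": SIR as transitions with
# input/output stoichiometry, from which the ODEs are derived generically.
import numpy as np
from scipy.integrate import solve_ivp

species = ["S", "I", "R"]
# Each transition: (rate constant, input counts, output counts) per species.
transitions = [
    (0.3 / 1000, [1, 1, 0], [0, 2, 0]),   # infection: S + I -> 2I
    (0.1,        [0, 1, 0], [0, 0, 1]),   # recovery:  I -> R
]

def rhs(t, x):
    dx = np.zeros_like(x)
    for k, inp, out in transitions:
        flux = k * np.prod(x ** np.array(inp))       # mass-action kinetics
        dx += flux * (np.array(out) - np.array(inp)) # net stoichiometry
    return dx

sol = solve_ivp(rhs, (0, 160), [999.0, 1.0, 0.0])
print(dict(zip(species, sol.y[:, -1].round(1))))
```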

(JPG to JQ) As you study mathematics and statistics, Jiahe, were there any specific aspects of statistical application that you enjoyed reading about in the special issue? Did this motivate you to think differently about these concepts?

JQ: The paper by Swallow et al, which tracked regional epidemics using weighted Principal Component Analysis (PCA), was very inspiring. PCA is an unsupervised machine learning method you taught us in first-year statistics, and I was aware of its application to time-series data, especially in finance. However, this paper was the first time I saw spatial and temporal information used in PCA not as columns in the data but as weight matrices, and saw how this can be used to identify a proxy for epidemic status. It was interesting that the results suggested hospitalisations were the key epidemic metric across the nations – this is in line with the policy changes (e.g. starting or stopping lockdowns) that were influenced by changes (increases or decreases) in hospitalisations.
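A much-simplified sketch of the weighted-PCA idea, with invented data and weights (not Swallow et al’s actual construction), might be:

```python
# Weight regional epidemic time series (rows = days, columns = regions)
# before extracting the leading principal component as a proxy for
# epidemic status. Data and weights are invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
X = rng.poisson(50, size=(200, 7)).astype(float)    # hypothetical daily admissions, 7 regions
w = np.array([1.0, 0.8, 1.2, 1.0, 0.9, 1.1, 1.0])   # e.g. population-based regional weights

Xc = (X - X.mean(axis=0)) * np.sqrt(w)   # centre, then apply column weights
U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
epidemic_index = Xc @ Vt[0]              # projection onto first weighted PC
print("variance explained:", (s[0]**2 / np.sum(s**2)).round(3))
```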

(JPG to DB) And you, Dan, studied mathematical biology last term, including simple SEIR models. Was there a specific aspect of mathematics applied to epidemiological modelling that you enjoyed within the special issue?

DB: Yes, simple SEIR models and the behaviour of their solutions are something I covered in my second year. What really surprised me while reading the special issue is how often such models occur in epidemiological modelling. One of the papers that really stood out for me was the one by Campillo-Funollet et al. When studying SEIR models in my degree, I often wondered how the initial data was gathered and how the parameters were estimated, and this paper did a great job of explaining why both are difficult for this kind of model. The observational model set out in the paper, while harder to study the solutions of, seems a good step towards addressing some of the drawbacks of the SEIR model.
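For readers who have not met them, a standard SEIR model of the kind Dan describes can be integrated in a few lines; the parameter values below are illustrative only.

```python
# Standard SEIR model integrated with scipy; parameters are illustrative.
from scipy.integrate import solve_ivp

beta, sigma, gamma, N = 0.4, 1/5, 1/7, 1e6  # contact rate, 1/latency, 1/infectious period, population

def seir(t, y):
    S, E, I, R = y
    return [-beta * S * I / N,              # susceptibles become exposed
            beta * S * I / N - sigma * E,   # exposed become infectious
            sigma * E - gamma * I,          # infectious recover
            gamma * I]

sol = solve_ivp(seir, (0, 300), [N - 10, 0, 10, 0], max_step=1.0)
print("peak infectious:", int(sol.y[2].max()))
```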

(JQ and DB to JPG) Did you learn something new in the process of editing the special issue?

JPG: Yes, of course I did. While all of the articles in the special issue had aspects that were interesting and expanded on the standard ways of doing modelling, two articles that stood out for me were those by Vernon et al and Dykes et al. Vernon et al illustrated an alternative calibration technique for agent-based models that seems more efficient and faster, while Dykes et al highlighted how important visualisation is throughout the analytical process, not just for illustrating the outcome of the analysis. Using history matching with an emulator seems to be a more efficient method for calibrating complex models – and this is important, as even simple rejection Approximate Bayesian Computation methods can take days or weeks to calibrate a real-life network-based ABM, with speed declining exponentially as the number of model parameters grows.
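The core history-matching step can be sketched very simply: an emulator gives a cheap prediction (with uncertainty) of the expensive model’s output, and parameter settings whose predictions are implausibly far from the observed data are ruled out. Everything numerical below is invented for illustration; in practice the emulator would typically be a Gaussian process trained on model runs.

```python
# Sketch of one wave of history matching: rule out parameter values
# whose emulated output is implausibly far from the observation.
import numpy as np

rng = np.random.default_rng(1)
observed, obs_var = 120.0, 15.0**2        # e.g. observed peak daily admissions

def emulator(theta):
    """Cheap stand-in for the expensive ABM: mean prediction and variance."""
    return 300.0 * theta, (30.0 * theta) ** 2   # hypothetical form

thetas = rng.uniform(0.1, 1.0, size=10_000)     # candidate parameter values
mean, var = emulator(thetas)
implausibility = np.abs(mean - observed) / np.sqrt(var + obs_var)
not_ruled_out = thetas[implausibility < 3.0]    # the conventional 3-sigma cut
print(f"{not_ruled_out.size} of {thetas.size} candidates survive wave 1")
```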

Intrigued by this method, I have recently started follow-up work, in collaboration with the lead author of the article, to check the feasibility, plausibility, and applicability of the history-matching calibration methodology on my Covasim model for COVID-19 in England. I also learned how Dykes et al have built on their work to break the widely held preconception that visualisation is predominantly for communicating results post-analysis, with an article that highlights the importance of incorporating visualisation early in the analytical process. While this is standard in 21st-century Visual Analytics, and has met with some success in visual epidemiological modelling efforts, the novelty of this work is in stressing that visualisation works at various stages, and in various ways, throughout the analytical process to support epidemiological modelling. I also learned how to edit a themed issue within a peer-reviewed journal, and even put my artistic skills to use by designing the front cover image.

I very much enjoyed the whole process – from early conversations about an idea behind the issue to producing the final product. I hope the themed issue will be enjoyed by both the epidemiological modelling community and the wider mathematics community.

Dr Christopher Metcalf, Lobel Fellow in Classics

Classics has a long history at Queen’s, and a distinctive profile. Ancient Greek and Latin have formed a central part of the College’s life since its foundation in 1341, for the simple reason that, before 1800, everyone would have done some sort of classical study, not least of Greek for reading the Bible. Classics continues to flourish at the College today, and has over time developed certain coherent and characteristic features: these include a special focus on the earliest Greek poets (such as Homer and Sappho); expertise in editing newly discovered primary sources (papyrology, epigraphy); and an interest in studying Greece and Rome in the context of the wider ancient world, particularly the ancient Near East and Egypt.

These characteristics can be traced as far back as the end of the 19th and the beginning of the 20th century. This was the time of what has been described as ‘surely the most successful partnership in the history of classical scholarship’, that between Bernard Grenfell (BA 1892; Fellow 1894-1926), ‘the greatest name in British papyrology’, and Arthur Hunt (BA 1893; Fellow 1906-1934). They discovered an immense cache of papyri in a rubbish-dump of the central Egyptian city of Oxyrhynchus (‘The City of the Sharp-nosed Fish’), which gave up a wealth of Greek literary and documentary texts, many of them new, reused for various purposes and then thrown away: with this discovery, the face of classical scholarship changed forever. A great deal of the Oxyrhynchus material was then edited at the College by another Titan of Classical scholarship, Edgar Lobel (Fellow 1914-1982). An unmatched knowledge of Greek and an extremely sharp eye allowed him to produce judicious, unspeculative and authoritative texts of an enormous range of authors, not least the lyric poets such as Sappho and Alcaeus; these remain fundamental models for papyrological work.


A papyrus fragment from Oxyrhynchus containing parts of Sappho’s poem on the wedding of the Trojan prince Hector and his bride Andromache (fr. 44), published by A.S. Hunt in 1927. Recent research by Dr Christopher Metcalf has shown that Sappho’s poem was known to the Roman poet Vergil, and that it was used by him as a literary model in composing the famous Dido and Aeneas-episode in the Aeneid. The ancient poets were very careful readers of Classical literature; if we would like to understand their works today, let us try to be the same! © courtesy of the Egypt Exploration Society

Also part of the College at this time was Thomas Allen (BA 1885; Fellow 1890-1950), best known for his herculean efforts on the manuscripts of Homer, which he collated in great numbers and digested into families, whilst also studying in detail the transmission of the text. He published the Oxford Classical Text of the Iliad in 1931, and Oxford classicists use it for examinations to this day. At this time too the College produced a Corpus Professor of Latin in A.C. Clark (BA 1881; Fellow 1882-1913), the editor of Cicero, and a Regius Professor of Greek in Ingram Bywater (BA 1862), another man of deep and patient scholarship, whose great work was a monumental commentary on Aristotle’s Poetics. The College’s classicists of this period went on to make important contributions in other areas too. Two stand out. Canon Burnett Streeter (BA 1897; Fellow 1905-37), Dean, Chaplain and Provost, was one of the most distinguished New Testament scholars of his day: he did important and entirely original work on the textual tradition of the Synoptic Gospels. Archibald Sayce (BA 1866; Fellow 1870-1933) turned East and became one of the greatest Near Eastern scholars of his time. Sayce began to learn Assyrian, Persian, Arabic, and Sanskrit while still at school; he read Vedic hymns with the great Max Müller as a Fresher, and deciphered cuneiform Ugaritic texts without a bilingual text to help him. He pioneered the serious study of Assyrian, and made significant contributions to the study of Hebrew, Egypt, Ancient Greece, and linguistic theory.

The end of the 20th century also saw a very distinguished group of Fellows leading the way in Classical studies. Fergus Millar (Fellow 1964-76), knighted for his contribution to Classical studies, was a Fellow and eventually Camden Professor, and author of major studies of the crowd and the Emperor at Rome, and more especially of many books on Rome’s relationships with its neighbours, especially in the East. He was succeeded by Alan Bowman (BA 1966), best known for work on the Vindolanda Tablets, with their invaluable information about life on Hadrian’s Wall. After a number of failures by others, Bowman and his colleagues solved the problem of decipherment using multi-spectral photography. He set up the Faculty’s Centre for the Study of Ancient Documents, which has major collaborations with the Physics Department and does a great range of imaging work with documents of all kinds. Contemporary with Alan Bowman was John Matthews (Fellow 1976-96), one of the pioneers of the study of the later Roman Empire, an area now firmly established as a major part of Classical studies. Our current ancient historian, Charles Crowther (Fellow since 2010), is the Assistant Director of the Centre for the Study of Ancient Documents, and specialises in Greek epigraphy, with museum and field projects in Chios and in the Commagene region of south-eastern Turkey. Crowther and Bowman are among the editors of the monumental Corpus of Ptolemaic Inscriptions, which includes not only Greek but also Egyptian (hieroglyphic and Demotic) documents, and whose first volume, on Alexandria and the Delta, was published in 2021.


This beautifully cut limestone inscription was found in excavations at Memphis, Egypt, and then donated to the Ashmolean Museum in 1909, where it was kept in storage and forgotten until its recent rediscovery by Dr Charles Crowther. It is a dedication, quite possibly by Alexander the Great, to Apis, the sacred bull of Memphis. Ancient historians attest that Alexander conspicuously participated in the cult of Apis while he was in Memphis. This stone is therefore a monument to the many interactions between Greece and the wider ancient world, in this case Egypt. © Ashmolean Museum, University of Oxford

Recent and current literary scholarship at the College similarly builds on and develops a tradition that has existed for several generations. Here the College’s recent profile has been marked by Angus Bowie (Fellow since 1981), who began his career with a study of the poetic dialect of Sappho and Alcaeus, and has continued to work on earlier Greek literature, especially Homer, Athenian drama, and Herodotus: his commentaries on Homer and Herodotus in particular remain a vital part of undergraduate teaching. Close textual work continues to be done by Almut Fries, who teaches Greek and Latin at the College and is an expert editor of Greek lyric and drama (Pindar, pseudo-Euripides), and by Christopher Metcalf (Fellow since 2016), whose interests lie in both Classical and ancient Near Eastern languages and literatures: his publications have sought to clarify the links between early Greek poetry and ancient Anatolia and Mesopotamia, and have also included first editions of new literary sources in the Sumerian cuneiform script. Much of current Classical research at the College is now integrated with the Centre for Manuscript and Text Cultures, which the College established in 2018 to bring together scholars working on primary sources in a very wide range of ancient and medieval cultures, from early China to the Mediterranean.

The profile that results from all of this is a distinctive combination of detail—in particular through editorial work on ancient texts—and breadth—which is reflected especially in our work on the wider cultural contexts of ancient Greece and Rome.

Dr Christopher Metcalf is Lobel Fellow in Classics at The Queen’s College, Oxford, and Associate Professor in Classical Languages and Literature at the University of Oxford. His research interests include early Greek poetry and, more broadly, the history of literature and religion in early Greece, Anatolia and Mesopotamia. His latest book is Three Myths of Kingship in Early Greece and the Ancient Near East: The Servant, the Lover, and the Fool (Cambridge UP 2024).

Dr Conor O’Brien FRHistS, Fellow in History

Most people think of Oxford as a medieval city, but when taking undergraduate students studying the period 900–1300 on a tour (as I did for the first time last Trinity Term) you quickly realise that very little of Oxford’s first few centuries still survives to be seen above ground. The first textual reference to Oxford appears in the Anglo-Saxon Chronicle entry for 911, by which time the town was clearly a functioning defensive and commercial site (a ‘burh’). The original burh was essentially a square with the four-way junction at Carfax at its centre. Its most obvious sign above ground is, of course, the church of St Michael at the North Gate (now on Cornmarket), whose eleventh-century tower is the oldest building in the city and the only surviving remnant of pre-Norman Oxford.

Oxford seems to have had a difficult Conquest. Domesday Book in 1086 records 57% of the housing stock of the city as ‘waste’; much of this was probably the consequence of urban clearing for the new defences, with the castle, built in 1071, taking pride of place. Castle Mound, opposite Nuffield College, is still there to be seen and represents the major disruption and dislocation of the period. Oxonians like to complain a lot about the latest hare-brained council scheme, but the new ‘local government’ of the late eleventh century was presumably unconcerned about residents’ feelings as it demolished their homes and businesses. The Anglo-Norman defences are actually probably the best-preserved part of early Oxford. New College has had to maintain the medieval walls in its grounds for many centuries, so gaining access to that college is definitely the best way to see them. Somewhat less impressive remnants of a bastion still exist in the garden of the Old Boys’ High School (i.e. the History Faculty), and Deadman’s Walk at the top of Christ Church Meadow runs alongside a segment of wall which now marks the edge of Merton.

Deadman’s Walk itself is a little slice of medieval Oxford, connecting the Jewish quarter around Blue Boar Street with the Jewish cemetery just outside the East Gate – thereby keeping Jewish funeral corteges carefully outside the boundaries of the city. Not much sign of Oxford’s twelfth- and thirteenth-century Jewish community now exists above ground: there is a plaque on the wall of the Town Hall on Blue Boar Street and a rather nice stone in the rose garden just outside the University Botanic Gardens. You are not likely to stumble on either if you were not looking for them, which is a great pity because, although Oxford’s medieval Jewish population was small, it was important. The presence of Jews in Oxford tells us something about the importance of the city during the Norman and Angevin eras, both its economic importance and its connectedness with the wider, cross-channel world of high politics. It also reminds us that by twelfth-century standards Oxford was a cosmopolitan place with diverse languages and communities.

What would have been the most dramatic sign of Oxford’s importance is also now marked only by a plaque. At the Worcester College end of Beaumont Street, on the corner with Walton Street, a stone set amid shrubbery records the existence of the royal palace, the King’s Houses, that was built just outside the north wall of the city in the 1130s. All remnants of the palace are long gone, the last ruins having been demolished to make way for the current stately appearance of Beaumont Street, but Oxford was quite a significant royal residence in the twelfth century, with two consecutive kings of England (Richard I and John) born here.

So, you’ll see that the undergrad tourist interested in seeing early medieval Oxford must use a lot of imagination. But there remain a few hints of the city’s earliest centuries. The little square between St Ebbe’s Church and the Westgate Sainsbury’s is hardly the most beautiful public space in Oxford, even if slightly improved by the construction of the new Westgate Centre. If you turn your back to Sainsbury’s, however, and look at the west wall of St Ebbe’s you will see a twelfth-century Romanesque doorway, planted into the current nineteenth-century church building. It is a lovely, somewhat unexpected, glimpse of Norman Oxford – lovely not least for the incongruity of the surroundings. Oxford does actually possess a very impressive complete Romanesque parish church at St Mary’s, Iffley (nice and close to the Isis Farmhouse for refreshments), but when undergrads hardly walk to the Bod, asking them to walk all the way to Iffley and back is unlikely to prove feasible for future tours.
