Centre for Statistics

Member Profiles: A Conversation with Colin Aitken

In the first of our series of member profiles, former Centre for Statistics Director Miguel de Carvalho interviews Colin Aitken, Honorary Professorial Fellow and retired Professor of Forensic Statistics.

Colin Aitken

Miguel de Carvalho: Hello Colin. It is a great pleasure to have the opportunity to discuss with you your career and main achievements, and to listen to your memories of Statistics in Edinburgh. When did you decide that you wanted to pursue an academic career?

Colin Aitken: There was no one moment when I decided an academic career was for me. The concept of a university as a community of scholars interested in the pursuit of knowledge for its own sake appealed to me from my schooldays. Whilst an undergraduate, also at Edinburgh University, I enjoyed the academic atmosphere and thought, naively, how much better it would be to be in such an atmosphere if there were no exams to worry about. It never occurred to me that, for an academic, exams are always there to worry about, year after year of setting, marking and grading!

The academic career grew on me.   I progressed from my undergraduate degree to a postgraduate diploma and a PhD to a lectureship motivated partly by curiosity to see how far I could go in academia.   At that point I realised I had a career.   There was no need to look further.  The curiosity to see how far I could go never left me.

Miguel de Carvalho

MdC: What was your first position after completing your PhD?

CA: Times have changed since I was doing a PhD in the late 1970s. Then there was a demand for statisticians in academia and a PhD was not necessary for appointment to a fully tenured lectureship. I was appointed to the lectureship in the then Department of Statistics at The University of Edinburgh from a lectureship in the Department of Mathematics at The University of Strathclyde before I had graduated with my PhD. It is difficult to imagine such a career progression now: appointment to two lectureships before a PhD graduation – it may even have been before the oral. I took up the lectureship in Edinburgh in September 1979 and graduated with my PhD from the University of Glasgow in November 1979, having first matriculated in September 1975. My lectureship in Strathclyde was from 1977 to 1979. That was a hectic time, as I was a new lecturer learning the ropes and trying to finish a PhD.

As I noted above, a PhD was not necessary for a lectureship in Statistics in those days. Several of my Edinburgh colleagues did not have a PhD and were well-respected academic statisticians.

MdC: How did you reach Edinburgh?

CA: My undergraduate degree was at the University of Edinburgh in Mathematical Sciences, a smorgasbord of mathematical topics. When, in my final year as an undergraduate, it came to thinking about what to do next, I postponed a choice of career and chose the postgraduate Diploma in Mathematical Statistics at Cambridge University. I had discovered a talent for statistics which conveniently went with my ideas of a community of scholars and gave me a good excuse to dabble in all sorts of disciplines – to play in other people's backyards, as one famous statistician once said.

The Diploma was a nine-month course, an alternative to Part III, which fitted in a dissertation within the nine months. We students there thought we were hard done by – other universities allowed their students twelve months and awarded them an MSc degree. We were only awarded a Diploma. However, the course had a very high reputation. Alas, it is no more, having been absorbed as one of the disciplines for which one can be awarded a Master of Advanced Study.

Cambridge Statistics then, as now, was a centre of excellence with David Kendall and Peter Whittle as Professors.     Bernard Silverman, Frank Kelly, Brian Ripley and John Kent were all doing PhDs at the time.

Medical statistics appealed for my project topic, partly because my family background was in medicine and partly because it had obvious practical importance.   When I enquired about good places to study for a PhD in Medical Statistics, Glasgow University was highly recommended.   As this was my home town and provided ready access to golf courses I was happy to go there.

I did not have funding for the full term of a PhD because of the Cambridge Diploma so, after a couple of years, I had to find a job which would give me time to work on and write up the PhD. Fortunately, I managed to obtain a lectureship in the University of Strathclyde in the Department of Mathematics as a statistics lecturer – a tenured post just over three years from my undergraduate degree, but something that was not unusual then. By now, I had realised that modelling of data and problems of inference and interpretation were what I was attracted to and that the theorem proving of mathematics held no appeal – even if I could have done it. I applied for and was offered a lectureship in the Department of Statistics in Edinburgh, under David Finney, where the topics were naturally statistical rather than mathematical. Once in Edinburgh there was no great attraction to move. All that was wanted for home and work reasons was available in a beautiful location.

I have been fortunate to have had the opportunity to work in a university environment. There, one is encouraged to pursue knowledge for its own sake and not in response to particular problems. Over many years whilst the subject of forensic statistics was developing, I was able to assist in that development without the demands of solving particular projects to short deadlines. It is difficult to imagine another career in which I could have worked in such a way.

MdC: What do you view as some of the main landmarks in your research career?

John Aitchison

CA: My interest in the application of statistics to forensic science was first motivated by a consultancy problem whilst I was working on my PhD at the University of Glasgow.  A forensic odontologist at the Glasgow Dental Hospital had collected dental impressions from 200 patients to use as background data to aid in the evaluation of evidence in the form of a dental impression from a suspect and from bite marks found, for example, on a victim in an assault case. He wanted a statistical analysis of these data. The work is published in Aitken and Macdonald (Applied Statistics, 1979). I am grateful to my supervisor, John Aitchison, for entrusting me with the problem.  When I moved to Strathclyde University in 1977, I discovered a very good forensic science unit there and I was able to develop my interest further in collaboration with the academic staff in the unit.

Whilst I was doing my PhD a paper was published by Dennis Lindley in Biometrika in 1977, entitled simply ‘A problem in forensic science’.   This is a very elegant paper which showed how the two important factors for comparing characteristics of traces found at a crime scene and in association with a person of interest, those of similarity and rarity,  could be combined in one simple form of the likelihood ratio.   Use of the odds form of Bayes’ theorem shows that this ratio converts prior odds in favour of a prosecution proposition relative to a defence proposition to posterior odds.  This paper gave me, and others, the inspiration to develop many other similar expressions for other data structures.
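Lindley's idea can be illustrated with a toy calculation. The sketch below uses entirely invented numbers and a univariate normal model (hypothetical measurement, means and standard deviations, not from any real case) to show the odds form of Bayes' theorem in action: the likelihood ratio multiplies the prior odds on the prosecution proposition to give the posterior odds.

```python
from math import exp, pi, sqrt

def normal_pdf(x, mu, sigma):
    """Density of a normal distribution, used here as a stand-in likelihood."""
    return exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * sqrt(2 * pi))

# Hypothetical measurement from a recovered trace.
y = 1.52

# Likelihood under the prosecution proposition (trace from the known source,
# mean 1.52, small spread) versus the defence proposition (trace from the
# background population, mean 1.50, larger spread). All numbers are invented.
lr = normal_pdf(y, mu=1.52, sigma=0.004) / normal_pdf(y, mu=1.50, sigma=0.01)

# Odds form of Bayes' theorem: posterior odds = likelihood ratio x prior odds.
prior_odds = 0.1
posterior_odds = lr * prior_odds

print(round(lr, 1), round(posterior_odds, 2))
```

The likelihood ratio here combines exactly the two factors Lindley identified: similarity (how close the trace is to the known source) and rarity (how unusual such a value is in the background population).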

The most important development in which I was involved was a Bayesian hierarchical multivariate model for the likelihood ratio.   This was work done in collaboration with David Lucy, now sadly deceased, and published in Applied Statistics in 2004.  It has become the foundational paper for many methods for the estimation of likelihood ratios.  David developed an R function for the method and it has also been emulated in other software packages.

Early in my career I gave a talk at the Liverpool local RSS group about statistics in forensic science. One member of the audience was a forensic scientist at the then Central Research Establishment of the Forensic Science Service, Ian Evett. Ian had provided the problem and associated data for Dennis Lindley's 1977 Biometrika paper. He came along to my talk to find out what this newcomer had to say. We went on to publish a paper in Applied Statistics in 1987 on a likelihood ratio for fibre transfer evidence. The method published involved predictive distributions, an idea developed by my PhD supervisor, John Aitchison. Ian has had a foundational role in the development of forensic statistics, in particular in relation to evidence interpretation and the application of statistics, probability and logic. He was awarded a CBE in the 2016 New Year's Honours list for services to Forensic Science. He played a fundamental role in the development of my career in forensic statistics.

DNA profiling came along in 1985.   This led to a large increase in the appreciation by the bench and criminal bar of the use of statistical and probabilistic reasoning in the courts.   It became easier for statisticians like myself to gain access to lawyers to try and persuade them that statistics and probabilistic reasoning had a role to play in criminal justice.  In 1987, Ian Evett suggested to me the time might be ripe, given the interest in DNA profiling, for an international conference on the role of statistics in forensic science, of which more later.

In the mid-1990’s I received a letter from a young researcher at the Forensic Science Institute at the University of Lausanne, the world’s premier forensic science research institute.   He was looking for somewhere to spend a post-doctoral period and wondered if I would be willing to host him.   I was always interested to have someone to do work for me and readily agreed.   His name is Franco Taroni (now a full Professor at the Institute in Lausanne) and his visit started a research collaboration and friendship which continues to the present day, about twenty-five years later.   We have published several books and many papers together.

Several colleagues from the Law School of the University of Edinburgh and from Glasgow Caledonian University set up a research centre, the Joseph Bell Centre for Forensic Statistics and Legal Reasoning in 2000 with funding from the Scottish Funding Council.  Perhaps we were ahead of the times, perhaps the three years’ initial funding was not long enough for its establishment but the Centre only remained active for about five years with most of its work in computer science.  One pleasurable outcome is that all three post-doctoral assistants we appointed went on to tenured academic posts.  The name is still attached to an annual seminar organised by Burkhard Schafer in the Law School.  

The work which the Centre foresaw is now carried on at a Leverhulme Centre at the University of Dundee and a Centre for Statistics and Applications in Forensic Science, a US research centre based at the University of Iowa, to name but two centres, with many other groups with PhD students and post-doctoral assistants and fellows scattered around the world.

The major contribution by statistics to the development of forensic science has been the introduction of Bayes' theorem to the evaluation of evidence, as first proposed by Lindley in 1977, though presaged at the end of the 19th century by the philosopher C.S. Peirce and developed at Bletchley Park in the Second World War. Bayes' theorem provides a rigorous approach to the measurement of the effect (value) of evidence in updating the prior beliefs in the prosecution and defence propositions. The theorem has enabled the rarity of the evidence and the similarity of control evidence (evidence whose source is known) and recovered evidence (evidence whose source is not known and which may be from the same source as the control evidence) to be evaluated simultaneously on a continuous scale.

The first of two more large contributions by statistics to the development of forensic science is the introduction of Bayesian networks, first mooted for use in forensic science in 1989 in a paper by myself and Alex Gammerman, now of Royal Holloway College, in the Journal of the Forensic Science Society (now Science and Justice, the journal of the Chartered Society of Forensic Sciences), and the subject of a book by Franco Taroni and Alex Biedermann of Lausanne, Silvia Bozza of the University Ca'Foscari in Venice, Paolo Garbolino of University IUAV in Venice and myself in 2014. These networks provide an intuitively satisfactory way of evaluating propositions and several pieces of evidence diagrammatically with the use of Bayes' theorem and suitable software. There is immense scope for them to be used in the future to help evaluate the copious amounts of data available in this information age.
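A flavour of what such a network computes can be given in a few lines. The sketch below, with invented probabilities, evaluates a minimal "network": a proposition node H (prosecution Hp versus defence Hd) with two items of evidence assumed conditionally independent given H, combined by exact enumeration. Real forensic networks are far larger and are handled by dedicated software.

```python
# Toy Bayesian network: proposition H with two pieces of evidence E1, E2
# assumed conditionally independent given H. All probabilities are invented.
prior = {"Hp": 0.5, "Hd": 0.5}
p_e1 = {"Hp": 0.9, "Hd": 0.1}   # P(E1 observed | H)
p_e2 = {"Hp": 0.7, "Hd": 0.2}   # P(E2 observed | H)

# Joint probability of each hypothesis with both items of evidence observed,
# then normalise to obtain the posterior over H (exact enumeration).
joint = {h: prior[h] * p_e1[h] * p_e2[h] for h in prior}
total = sum(joint.values())
posterior = {h: joint[h] / total for h in joint}

print(posterior)
```

The conditional-independence assumption is what lets each item of evidence contribute its own likelihood ratio; the graphical structure of a full network encodes precisely which such assumptions are being made.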

The second large contribution is the concept of different levels of propositions for consideration, namely sub-source, source, activity and offence levels. This concept enables factors such as transfer, persistence and relevance to be considered in an objective manner. This idea was first proposed by Cook and others in 1998 in a series of papers in Science and Justice.

[Sorry – that is rather a long answer;  too many landmarks!]

Cocaine Crime Stats

MdC: You have written several witness statements over your career and been an expert witness in two trials related to drugs on banknotes. Can you tell us about the statistical challenges involved in all this, and about the communication of forensic findings?

CA: The biggest difficulty I had was when I had a result which was statistically significant but not meaningfully significant. Amy Wilson, a lecturer in the School of Mathematics, an analytical company (who had provided the data for drugs on banknotes and had also supported Amy's PhD) and I had done some research on the geographical variation in England in the quantity of cocaine on banknotes, in response to criticism of the company's database in a trial of first instance in Sheffield in 2014. For example, the judge commented that "[i]t is a database of pure convenience" and that "[t]he assertion that notes from banks are typical is not supported by any evidence and is illogical". Our research, published in Forensic Science International and assisted by the Bank of England, who provided notes from their distribution centres around England, showed that there was no meaningful significance in the variation in the quantities of cocaine on banknotes in general circulation around England. The database was very large, so almost any difference would be statistically significant, and this was indeed the case. What was meant by meaningful was determined by the company. It was not something Amy or I could decide.

For a difference to be meaningful it is generally necessary for it to be statistically significant.   Statistical significance is not sufficient, however.  Unfortunately, lawyers have picked up on the idea of statistical significance and think it is necessary and sufficient to show a meaningful difference.  
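The point can be made numerically. In the sketch below, with invented figures (a tiny difference between two regional means, measured on very large samples), the difference is overwhelmingly statistically significant simply because the samples are enormous, while the effect size shows it is negligible in practice.

```python
from math import erfc, sqrt

def two_sample_z(mean1, mean2, sd, n1, n2):
    """Two-sided p-value for a difference in means, assuming a known common sd."""
    z = (mean1 - mean2) / (sd * sqrt(1 / n1 + 1 / n2))
    return z, erfc(abs(z) / sqrt(2))   # two-sided normal tail probability

# Invented numbers: a tiny regional difference in mean log-quantity of
# cocaine on banknotes, measured on two very large samples.
sd = 1.0
z, p = two_sample_z(5.02, 5.00, sd, n1=200_000, n2=200_000)

# Effect size: the difference is only ~0.02 standard deviations.
effect_size = (5.02 - 5.00) / sd

print(round(z, 2), p < 0.001, round(effect_size, 3))
```

With these sample sizes the p-value is vanishingly small, yet no one would regard a shift of a fiftieth of a standard deviation as meaningful: statistical significance is necessary for a meaningful difference, but never sufficient.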

It came to pass that I was being examined under oath about the variation in quantities of cocaine on banknotes across England.   The barrister had noted that I had reported there was no meaningful difference.  He then wished to emphasise the point.  He noted, wrongly, that for there to be no meaningful difference the results had to be statistically insignificant.   Would I please confirm this was the case?  I could not without perjuring myself.  However, disagreement would open a philosophical can of worms which would probably lose the barrister, judge and jury and leave them with the impression there was meaningful variation.   Part of the problem was that I was in Switzerland testifying over a video link to a courtroom in Liverpool so it was impossible to assess how the judge and jury, whom I could not see, were understanding my testimony.  I extemporised, repeating the phrase ‘no meaningful significance’.  The barrister did not pick up on it and the examination finished there, much to my relief.

There have been amusing moments.  Two, in particular, spring to mind.

The first occurred when a Head of Chambers in London committed the prosecutor's fallacy. I explained that this was incorrect; a small probability of finding the evidence on an innocent person did not mean that there was a small probability that a person on whom the evidence was found was innocent. The Head of Chambers responded, "No. You must agree, Professor Aitken, that I am right and you are wrong". I replied that, no, I did not agree, which caused some amusement amongst the juniors around the table, who understood what I meant and were wondering how their Head would respond. The conversation moved on. About five minutes later, the Head stopped in full flow to note "Yes, Professor Aitken, you are right" and then went back to what he had been saying. My satisfaction in his acceptance that he had committed the fallacy was tempered by the lack of an apology.

The other amusing moment occurred during the video testimony from Switzerland to which I referred above.   The defence expert was testifying in rebuttal of my evidence.  I was still visible to the court on a large screen so had to maintain a straight face throughout.   Alongside me was a colleague who was looking after the IT side of affairs but who was also an expert in forensic statistics.  At one point in the defence expert’s testimony he passed a note to me which said, in effect “He’s talking nonsense”.   I agreed but it took a great deal of effort not to smile.

MdC: You have led a project sponsored by the Nuffield Foundation to produce a series of reports for the criminal legal profession on probabilistic and statistical reasoning in criminal proceedings. Can you tell us more about this?

CA: Four reports were produced over a period of four years from November 2010 to December 2014, under the auspices of the Royal Statistical Society (RSS) with sponsorship from the Nuffield Foundation. They covered topics such as the fundamentals of probability and statistical evidence in criminal proceedings, the assessment of the probative value of DNA evidence, the logic of forensic proof with consideration of networks for structuring evidence, and case assessment and interpretation of expert evidence. The overall title was 'Communicating and interpreting statistical evidence in the administration of criminal justice: guidance for judges, lawyers, forensic scientists and expert witnesses'. The reports are available as pdf files from https://rss.org.uk/membership/rss-groups-and-committees/sections/statistics-law/

Scottish and English criminal adjudication is strongly wedded to the principle of lay fact-finding by juries and magistrates employing their ordinary common sense reasoning. However, it cannot be assumed that jurors or lay magistrates will have been equipped by their general education to cope with the forensic demands of statistics or probabilistic reasoning. It is sometimes claimed that lawyers and the public at large fear anything connected with probability or statistics in general, but irrational fears are no excuse for ignorance in matters of such practical importance as the administration of criminal justice. More likely, busy practitioners lack the time and opportunities to fill in persistent gaps in their professional training. Others may be unaware of their lack of knowledge, or believe that they understand the subject but do so only imperfectly.

The reports aimed to fill this statistical and probabilistic gap in UK forensic practitioner guidance.  The reports were written by a multidisciplinary team with myself and Paul Roberts (Professor of Criminal Jurisprudence at The University of Nottingham) as the principal investigators for the Nuffield Foundation grant and with Graham Jackson, Sue Pope and Roberto Puch-Solis as co-authors.  They were produced under the auspices of the then Statistics and Law working group (which is now a Section)  of the RSS whose members included representatives from the judiciary, the English Bar, the Scottish Faculty of Advocates, the Crown Prosecution Service, and two institutions that no longer exist, the National Policing Improvement Agency and the Forensic Science Service.  Meetings of the group had some very lively discussions, always conducted in a friendly manner.   The group secretary, an officer of the RSS, commented that he wished he could sell tickets to our meetings!  The reports benefitted from input from an international advisory panel with forensic scientists, lawyers and statisticians from Australia, New Zealand, Switzerland and the USA.  Advice on criminal litigation was provided by His Honour Judge John Phillips, Director of the Judicial Studies Board for England and Wales and by Sheriff John Horsburgh for Scottish law and practice.

We like to think the reports were well received.   The first report received a favourable mention in a 2011 report by the Law Commission of England and Wales on expert evidence and all four reports were mentioned in a judgement by the Kentucky Supreme Court in the USA in 2014.  Other reports on the same topic from other sources have since appeared.  The Inns of Court College of Advocacy and the RSS combined to produce a report on statistics and probability for advocates.   The Royal Societies of London and Edinburgh in conjunction with the judiciary are producing a series of primers to assist the judiciary when handling scientific evidence in the courtroom.  The European Network of Forensic Science Institutes have also produced a guideline for evaluative reporting in forensic science.

There is no doubt that there is an increasing acceptance of the benefits of sound probabilistic and statistical reasoning in the administration of criminal justice.    The provision of our reports and the subsequent guides and reports will help the understanding of such reasoning amongst legal practitioners.

David Finney

MdC: Two other prominent Professors of Statistics of the Department have been Prof. David Finney and Prof. Alexander Aitken. Have your paths crossed somewhere in time?

CA: David Finney appointed me as lecturer in 1979 and we were colleagues until his retirement in 1984.  I was his last academic appointment.  After his retirement, I would meet him socially at seminars and the like.  I also visited him occasionally at his home in Edinburgh when he had ceased coming in to the Department (as it then was, not a School).   I was honoured by his attendance at my retirement party in December 2016.

I never met Professor A.C. Aitken, much to my regret. I heard older colleagues talk of him. I have read his book on his experiences as an infantryman in the First World War, From Gallipoli to the Somme, an excellent book on the horrors of war. I also inherited David Williams' copy of A.C. Aitken's Statistical Mathematics, a very interesting book written before the days of computers, with many suggestions for the simplification of calculations.

Alexander Aitken

MdC: I guess you must have been asked quite a few times whether you are a relative of Prof. Alexander Aitken?

CA: I was first asked about my relationship with A.C. Aitken whilst at school in Glasgow, over 50 years ago. Alas, we are not related. He was a New Zealander but his family's roots were in Central Scotland. The surname Aitken is a fairly common Central Scotland name (and rare in the rest of the world). It is not to be confused with various English versions such as Aitkin or Atkin. Also, note the 't' is silent, which has led to confusion when checking into conference hotels in England: 'Sorry, we have no Aiken registered; there is an Atkin but no Aiken'. Several years after my appointment to Edinburgh, David Finney asked me if anyone pronounced my name with a silent 't'. It was with a certain degree of mischievousness that I replied 'Yes, I do!'.

MdC: Your books are seminal references in the field. Tell us a bit about them.

CA: There are five books in which I have been involved as an author or, in one case, as an editor. All were motivated by a desire to bring together in one place the ideas relating to a particular theme which were scattered amongst the literature. They are aimed at practitioners, mainly forensic scientists, though some parts would be of interest to criminal lawyers also. Without an appropriate book, these practitioners and lawyers would have difficulty learning about a topic. The material would be in many journals and it would be difficult to distil into a coherent thesis about the topic.

The advent of DNA profiling in the mid-1980s led to an increasing interest in the use of statistics and probabilistic reasoning in forensic science. Thus it was that in 1991 Dave Stoney, a US forensic scientist, and I edited a set of essays on the general theme of The Use of Statistics in Forensic Science. Thirty years later some of it is obviously dated. However, two essays in particular are favourites of mine, one by Dennis Lindley on 'Probability' and one by Jack Good on the 'Weight of Evidence and the Bayesian Likelihood Ratio'. These have stood the test of time and I turn to them often for the quality of the writing on these topics. Amazingly, the book is still being purchased. My only regret is that the publishers put a blood-stained histogram on the cover, a design about which I was never asked and which is not particularly relevant to the contents, as histograms played little part in the book. For all the other books I have ensured that I was consulted about the design of the cover.

The second book followed a few years later, in 1994. I thought the time was appropriate for a general book on forensic statistics. Thus was born 'Statistics and the evaluation of evidence for forensic scientists', which has become the main book on the subject, is used for teaching in Master's courses in forensic science and is cited in many research papers. The first edition ran to 260 pages and ten pages of bibliography. The second edition appeared in 2004 with Franco Taroni as co-author and was 500 pages long. Franco and I, along with Silvia Bozza, have just finished the third edition. It appeared in December 2020 and is over 1200 pages long with over 90 pages of bibliography. The growth of the subject has been huge. The book has more than doubled in size with each edition, a progression which may make a publisher reluctant to take on a fourth edition!

The other three books in which I have been involved are more specialised. The first two have Franco as the lead author, with myself, Silvia, Alex Biedermann and Paolo Garbolino as co-authors. One, published in 2010, is concerned with decision theory in forensic science, where there is interest in the utility of an interpretation of the weight of evidence – a book motivated by a comment of Dennis Lindley's, in his Foreword to Franco's and my 2004 book, that our 'near-silence' on decision-making in the book was 'disappointing'. The second, the second edition of which was published in 2014, is concerned with Bayesian networks, in which the associations amongst various pieces of evidence and the propositions of the prosecution and defence are represented. The third, with Grzegorz Zadora of the Institute of Forensic Research in Krakow and the University of Silesia at Katowice as the main author, and Agnieszka Martyna of the Institute of Forensic Research, Daniel Ramos of the Universidad Autonoma de Madrid and myself as co-authors, is concerned with the evidential value of multivariate physicochemical data. It is an applied book with a large amount of R code and many examples, with copious explanations of the calculations of evidential weight.

I hope these books serve their intended purposes well.

MdC: You have been much involved with editorial work as the founding editor-in-chief of Law, Probability & Risk. What have you learned from that experience?

CA: My first advice to anyone considering starting a new journal would be "Don't!". It is a lot of hard work and nerve-wracking. Is anyone going to write for the journal? Is anyone going to read it? I stepped down from the role a few years ago and have been lucky to be succeeded by, first, Joe Gastwirth of George Washington University and, now, Anders Nordgaard of the Swedish National Forensic Centre, both of whom have done excellent work. In my time as Editor-in-Chief, I was fortunate that very good people were willing to be editors and others to join an editorial board. Then these people and others were willing to submit papers. Gradually the number of submissions grew, libraries purchased it and citations appeared, leading to a virtuous circle and the award of an impact factor, currently (September 2020) 1.059. However, the worries about the supply of papers never go away.

The motivation for its foundation, as with other journals I suppose, was consideration of the dissemination of knowledge.   There was no obvious organ for communication of some of the work that I and others were doing.   Work which was obviously statistical could be published in the journals of the RSS, for example, and there were international journals for publication of work which was obviously of a forensic scientific nature.   However, I felt there were ideas which fitted neither discipline and which were important for lawyers, as distinct from statisticians and forensic scientists, to know about but for which there was no one place that could act as a focus for this work. 

You may also be interested in the development of an international conference.

I remember the occasion well.  It was during the International Association of Forensic Sciences meeting in Vancouver in 1987.   DNA profiling had been around for a couple of years and had revolutionised forensic science and raised the profile of the use of probabilistic thinking in evidence interpretation.  Ian Evett approached me and suggested the time might be ripe for a conference on the role of statistics in forensic science.  I agreed to host a conference in Edinburgh and the first international conference in forensic statistics was held in Edinburgh in 1990.   It was then and still is now the only international conference to bring together lawyers, forensic scientists and statisticians to discuss matters at the interface of the three disciplines.   The conference has gone from strength to strength with widespread international participation and healthy competition to host the conference.   It is held every three years, alternating between Europe and the USA. Sadly the 2020, thirtieth anniversary, conference has had to be postponed because of Covid-19.  Selected papers arising from the conference are published in Law, Probability and Risk. 

MdC: Can you tell us about your view on the interface between law and statistics, and on the future of forensic statistics?

CA:  It makes more sense to ask about the interface of law, forensic science and statistics, not just the interface between law and statistics.  Also, perhaps the two topics, interface and future, could be the subject of separate answers.

Interface: The importance of statistics to the law is not new. Writing in the Harvard Law Review in 1897, Oliver Wendell Holmes Jr., later a US Supreme Court Justice, observed that, “For the rational study of the law the blackletter man may be the man of the present, but the man of the future is the man of statistics and the master of economics.” This is a wonderful statement and one of which current jurists should take heed.

When I started in the late 1970s there were few people working at the interface of law, forensic science and statistics. Now there is an international conference, a journal, research centres, many books and scientific and legal papers, and university courses for lawyers and forensic scientists, with PhD students, post-doctoral fellows, many tenured academics and much discussion in Appeal Court judgements.

This interface is a fascinating place in which to work. There is the opportunity to develop statistical models of some complexity for interesting data. There is a clear application, the administration of criminal justice, with strong pressures not to mess up! In addition, there is a need to be able to explain sophisticated inferences in a language that can be understood by generally highly intelligent people, lawyers, some of whom chose their profession partly because it did not involve mathematics. Only occasionally have I had to testify as an expert witness. When one does, the evidence has to be comprehensible to a jury, a random sample of the general population, a difficult challenge!

Future: As regards the future, Ian Evett and Bruce Weir, a statistical geneticist at the University of Washington, many years ago stated three principles for evidence interpretation:

  1. To evaluate the uncertainty of any given proposition, it is necessary to consider at least one alternative proposition.
  2. Scientific interpretation is based on questions of the kind ‘What is the probability of the evidence given the proposition?’
  3. Scientific interpretation is conditioned not only by the competing propositions but also by the framework of circumstances within which they are evaluated.

Ideas for the evaluation of new forms of evidence will have to follow these principles and, as Jack Good showed, it is the likelihood ratio, or a function of it, that should be used for the evaluation.
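The likelihood-ratio idea can be made concrete with a small numerical sketch (not part of the interview). The propositions Hp and Hd and all probabilities below are entirely hypothetical, chosen only to illustrate the arithmetic:

```python
# Toy sketch of likelihood-ratio evidence evaluation.
# Hp: prosecution proposition (e.g. the trace came from the suspect)
# Hd: defence proposition (e.g. the trace came from someone else)
# All numbers are invented for illustration.

def likelihood_ratio(p_evidence_given_hp: float, p_evidence_given_hd: float) -> float:
    """LR = P(E | Hp) / P(E | Hd); values above 1 support Hp over Hd."""
    return p_evidence_given_hp / p_evidence_given_hd

# Hypothetical example: a matching characteristic is certain if the trace
# came from the suspect, and occurs in 1% of the relevant population otherwise.
lr = likelihood_ratio(1.0, 0.01)
print(lr)

# Bayes' theorem in odds form: posterior odds = LR x prior odds.
# The prior odds come from the framework of circumstances, which is for the
# court, not the scientist, to assess.
prior_odds = 1 / 1000
posterior_odds = lr * prior_odds
print(posterior_odds)
```

Note the division of labour the odds form makes explicit: the scientist reports only the likelihood ratio for the evidence, while the prior odds belong to the court.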

The main challenge for forensic statistics in the future surrounds digital evidence and big data.

The main challenge for forensic statistics in the future surrounds digital evidence and big data. There are obvious specific examples in image analysis, such as facial recognition, speech recognition, and gait analysis, which adds the complexity of movement to that of image analysis. However, the challenge is much, much bigger than that. Consider the Evett and Weir principles applied to computer hacking.

  1. What are the propositions? Three come to mind: X hacked the computer systems of Y; someone else hacked the computer systems of Y; no-one hacked the computer systems of Y.
  2. What is the probability of the evidence given the propositions? What is the evidence? What probabilistic model can be determined? How might the performance of the model be assessed? How is it known to be a good model?
  3. What is the framework of circumstances? If the hacking is related to e-mails, what background e-mails should be considered against which the suspicious nature of the questioned e-mails could be compared?

In the long term I would like to see forensic statistics develop as a discipline comparable to medical statistics with a good level of research funding.   

In the long term I would like to see forensic statistics develop as a discipline comparable to medical statistics, with a good level of research funding. It would also be good to see statistics and probabilistic reasoning, perhaps along with evidence interpretation and evaluation, included in the training of lawyers. This is an aspiration suggested by the Law Commission report referred to earlier: “ideally, law students would in due course receive instruction on scientific methodology and statistics as part of their undergraduate courses, and the CPD requirements for practising solicitors and barristers who undertake work in criminal law would be amended to require attendance at approved lectures covering the same areas (in the context of criminal proceedings)”.

With an established discipline of forensic statistics, and a bench and bar with appropriate exposure to reasoning with uncertainty as part of their professional training, I believe the opportunities for miscarriages of justice will be greatly diminished.

MdC: Thank you very much, Colin. It has been an honour.