022 – Evidence, Research and Society

The IDEMS Podcast

Description

David and Lucie discuss the role of evidence in research and in our wider society. What is good evidence? How should evidence be used? The last decade has seen the erosion of public trust in facts – so what can and should we be doing differently? 

[00:00:00] Lucie: Hi and welcome to the IDEMS podcast. I’m Lucie Hazelgrove Planel, an anthropologist and social impact scientist, and I’m here today with David Stern, a founding director of IDEMS. Hi David.

[00:00:18] David: Hi Lucie. I’m looking forward to another interesting discussion. What are we planning to discuss today?

[00:00:24] Lucie: I thought we could discuss evidence. So evidence in research and in terms of research methods, and what is good evidence? Who is evidence for? Why is evidence important? I have lots of questions that we can discuss.

[00:00:43] David: Good philosophical questions here. I look forward to an interesting discussion and I hope nobody thinks I have any answers.

[00:00:52] Lucie: But that’s exactly what we were expecting!

[00:00:54] David: Good. Yes, I think there are some really interesting questions here, and I’m always surprised by how complex evidence is in so many different contexts, and by what really can constitute evidence. We’ve touched on some of this in other podcasts and it’s something which I’m sure will continue to come up, but I think one of the starting points has to be the recognition of the scientific method, its importance in establishing evidence, and the value that it has brought to societies all over the world.

Don’t get me wrong, there are serious problems with scientific research. We will discuss these in a number of podcasts and have already touched on some of them. But the scientific method, I would argue, is central to understanding how we establish evidence.

[00:02:01] Lucie: Yeah, that makes me think that there’s sort of good evidence and bad evidence, where bad evidence perhaps hasn’t followed a scientific method, and so results can’t be reproduced.

[00:02:11] David: Well, it’s not just about reproducibility, although of course that is a really important element of evidence and research. Maybe I should clarify what I mean by the scientific method, because there are lots of different ways this could be interpreted. I would argue that what really distinguishes the scientific method from many pseudo-scientific approaches is that in the scientific method you can never show something is right. You have your hypothesis, and you can have evidence which basically shows your hypothesis is wrong.

Newtonian mechanics is a fantastic example people can understand. It was established using the scientific process and had a lot of evidence behind it: experimental evidence, all sorts of different evidence. But it was never that it was right.

It was always that, within the contexts in which it was being tested and evaluated, it was not yet wrong. Until it was wrong. People observed things, the stars, the behaviour of light, how things moved in different ways, and this is where Einstein’s theories of relativity came in to fill the gaps where Newtonian mechanics was wrong. Not just the apple falling, as in Newtonian mechanics, but the bending of light, that sort of thing: the bending of space-time, as you think of it in terms of Einstein’s theories of relativity. The limitations of a given theory were found and new theories were developed to be able to say, okay, Newtonian mechanics breaks down when you go very, very fast, or when you consider things that are very, very small. Very, very small is interesting because that’s when you get to quantum mechanics, and quantum mechanics is a whole other set of rules, and so on.

I love the idea that with the scientific method we don’t know that any of these theories are right. We just know that in the contexts within which we use them and have tested them, they’re not yet wrong. We haven’t yet found the context within which they’re not right. And that’s a good scientific approach. Actually, a really interesting scientific discovery might be to identify a context within which an established scientific theory doesn’t seem to work, or where things don’t work in the way you would expect from the standard scientific understanding. That would be an incredible discovery.

Now you might not yet have a theory to replace it, to say this is how it does work, that would be a next step, but it would be a fantastic scientific discovery to identify a context within which your best current knowledge does not apply. And as a scientist who has a theory, finding contexts where your theory seems to be wrong or does not seem to apply, that’s not something to brush under the carpet, that’s something to celebrate. That is central to what I would see as the scientific approach: your theories are there to be found to be wrong, not to be proved to be right.

With pseudoscience, you have a theory and you try to find evidence that proves your theory is right. With the proper scientific process, you have a theory and you try to find evidence that your theory is wrong, and if you cannot find evidence that your theory is wrong, then you say you have not yet shown it to be wrong, and so it’s the best theory you have to date.

[00:06:26] Lucie: Okay, so if I understand correctly… let’s say you have a piece of evidence, and let’s say that the quality is good, whatever that means, we haven’t yet gone into that. It can be used in a good or a bad way.

[00:06:39] David: Yes.

[00:06:40] Lucie: So I’m not so used to this idea of having a hypothesis before doing research. I can understand it in agricultural experiments, if you’re trying to test, you know, the effect of adding compost to a plant. I can see there how the results of your experiment will give you evidence of something… but you shouldn’t…

[00:07:05] David: Let me take that particular example. You have a hypothesis that this new way of doing compost is better than your old way of making compost.

[00:07:16] Lucie: Yeah.

[00:07:17] David: And you’ve got an innovation which relates to how you produce the compost, which you believe will lead to these improvements.

[00:07:25] Lucie: Is it just that you say, it seems that it confirms my hypothesis…

[00:07:31] David: No.

[00:07:31] Lucie: …but you haven’t yet got evidence to the contrary?

[00:07:34] David: It’s not about confirming your hypothesis. That’s the point. Your hypothesis is that this particular approach to making compost is going to mean there are more nutrients available to the plant, and therefore the plant grows better. And so you then do your experiment, where you take your old way of doing compost and you compare. Of course, the experimental method is the only way to determine causation. So in this particular case, we’re looking to establish that the new way of making the compost is actually causing some form of improvement to the plant.

And you’re doing this in different ways. And what you find, once you’ve done this, is that indeed, as you expected and as the theory predicted, because you followed the theory and you have a hypothesis as to why this is better, you get your expected result. And so as a scientist, you say, yes, our theory doesn’t yet have anything to contradict it.

And then you do this in different experiments, and you get a result where you say, well, actually in this context, in this situation, our theory that this would be better didn’t hold. Now, as a good scientist, that’s the one you’re interested in. Because then you’re interested in asking, okay, why isn’t it better in this context? Why didn’t it hold? That’s what you want to dig into and understand. Whereas if you’re a pseudoscientist, you brush that under the carpet: oh, that was just an anomaly.

With the scientific approach, the scientific method, you’re looking to try and understand why your theory is wrong. Your new way of making compost should have increased the nutrient availability to the plant, but in this particular context, for whatever reason, that didn’t actually have an effect. Maybe that’s because in that soil where you applied the compost, nutrient availability was not a limiting factor; there were other limiting factors. Or maybe, because of the drainage of the water going through, that increased nutrient availability actually meant that more of the nutrients got washed through the soil and left it, rather than getting absorbed by the plant. And so it had a different effect in that particular context.

I’m making all this up. Yeah? I don’t actually know what’s happening in different ways in terms of nutrient availability, but I do know that, you know, this is an important area of actually how plants absorb nutrients and sort of grow.

But I think that’s what’s really important: as a scientist, if you get a result which is exactly what you expected, that’s not very interesting.

[00:10:16] Lucie: Yeah.

[00:10:17] David: What’s really interesting is when things go against your expectation. That’s when you want to dig in and understand.
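To make the compost example concrete, here is a minimal sketch of the falsification logic David describes: state a null hypothesis that the new compost is no better, collect plot yields, and ask whether the data contradict that null. The yield figures, plot counts and the use of a Welch t statistic are illustrative assumptions, not anything described in the episode.

```python
# A minimal sketch of the falsification logic described above: state a null
# hypothesis ("the new compost is no better"), simulate plot yields, and ask
# whether the data are surprising enough to reject that null. All numbers here
# (yields, plot counts, effect size) are invented for illustration only.
import random
import statistics

random.seed(1)

# Hypothetical yields (kg per plot) under the old and new compost methods.
old_compost = [random.gauss(5.0, 1.0) for _ in range(20)]
new_compost = [random.gauss(5.6, 1.0) for _ in range(20)]

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    mean_a, mean_b = statistics.mean(a), statistics.mean(b)
    var_a, var_b = statistics.variance(a), statistics.variance(b)
    return (mean_b - mean_a) / ((var_a / len(a) + var_b / len(b)) ** 0.5)

t = welch_t(old_compost, new_compost)
print(f"Welch t statistic: {t:.2f}")

# A large |t| is evidence *against* the "no difference" hypothesis.
# A small |t| never proves the new compost works; it only means the
# "no difference" hypothesis has not yet been contradicted by these plots.
```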

[00:10:24] Lucie: And I know within our team we sort of talk about being a data detective.

[00:10:29] David: Yeah.

[00:10:30] Lucie: Which is absolutely a concept that I really agree with and sort of enjoy. Because it’s all about trying to understand, as you’ve been saying, what is actually going on, what the data is telling you, as opposed to what you are interested in finding out, perhaps.

[00:10:45] David: And that’s the key point, and this is where the good scientific method is… you’re not trying to look at how the data confirms what you think is true. You’re trying to use the data to disprove what you think is true. And if you fail to do so, you say, I still can’t disprove it in some sense. And so this is still the best hypothesis I have.

Let’s think about this a little bit in context, which are more anthropological.

[00:11:18] Lucie: Thank you.

[00:11:18] David: And if I understand correctly, and you correct me if I’m wrong on this, broadly there are multiple anthropological methods. In some, you go in with hypotheses, or things that you’re looking for, understandings that you’re looking for, theoretical understandings which you’d have got from the literature and from the theories you go in with. In other methods, you would actually try to avoid doing that and try to present yourself as a really objective observer, in a different way.

I’m going to focus on the first, where you have a theoretical background, you’re going in, and you’re looking to test what your theoretical knowledge is telling you about the environment you’re going to observe. Because that’s where I feel the scientific approach applies in the same way. As an anthropologist going in with a theory of how things work, whatever it might be, you don’t look just for evidence which confirms what you’re looking for.

[00:12:17] Lucie: Nice.

[00:12:18] David: That would be totally wrong.

[00:12:20] Lucie: Yes, that would be very strange.

[00:12:22] David: Yes. You look in particular, and you would probably pay particular attention to things which seem to contradict what you would have expected.

[00:12:31] Lucie: Yeah.

[00:12:33] David: Yeah, that would be central to the anthropological approach. This is the scientific method. This is why anthropology is a scientific discipline, just like so many others. It is built on the same scientific methods.

[00:12:48] Lucie: And that really surprised me when I first started doing anthropology, not having known much about what anthropology was, other than that it was the study of how people work, or how people are in general.

[00:13:04] David: But this is the thing and this is what’s so powerful about these scientific methods. They are universal.

[00:13:11] Lucie: Yeah.

[00:13:11] David: It doesn’t matter whether you’re doing nuclear physics or anthropology, which couldn’t be further from each other. But underlying them, you have the same powerful scientific principle. Now what has been so difficult for me and for many others over the last few years, and this goes back quite a long time, is the eroding public understanding of evidence and science.

[00:13:40] Lucie: Oh, interesting one, yes.

[00:13:42] David: I mean… when politically people talk about alternative facts as if that is okay… There’s always been a political element to how we determine and use evidence, and so on. But more recently, particularly in the UK and the US, this has got to an extreme, which has just eroded public confidence in science, in evidence, in experts, to the extent where I’m really worried. And I think there are whole sections of society who are well educated, highly intelligent, well informed in many ways, but because of this bombardment of pseudoscience or alternative facts, or…

[00:14:37] Lucie: It’s not only about science, it’s where things get politicized and where, basically, culture and ideologies start influencing people’s ideas.

[00:14:49] David: Well, I agree, but to me there’s something more fundamental about that, when you cannot actually understand evidence, when you can’t accept evidence. And part of this is the fact that evidence, in certain ways, got used to present things that weren’t actually true, people were presenting false evidence, and we need to understand how that erodes public confidence in different ways. These are issues of being able to make sure that evidence is independent of politics, of ideology, of beliefs. Evidence is the foundation of our society, because all the technology we use has come out of scientific evidence and advances, where we have confidence in them, where we understand with confidence how they will reliably work, because of the scientific processes behind them.

I don’t want to get sidetracked too much by this. I want to focus on the evidence. The danger of not having societal norms that establish evidence independently of politics and ideology is so important. Because in the political discourse, elements of science and evidence get weaponized, and this goes back a long way. I mean, the tobacco industry?

[00:16:34] Lucie: Yes! There’s that great film, is it Thank You for Smoking?

[00:16:40] David: I don’t know if I’ve seen that one.

[00:16:42] Lucie: Ah, it may not be great, I mean, I watched it a while ago.

[00:16:47] David: I do know that there’s been so many documentaries which have highlighted how the tobacco industry basically, used all sorts of tactics to be able to discredit scientific effort.

[00:17:05] Lucie: Yeah.

[00:17:05] David: These mechanisms have been taken up by the oil industry related to climate change.

[00:17:11] Lucie: I was just thinking of that.

[00:17:13] David: And the same approaches have now been taken up at a societal level. So just as our ability as a society to actually determine the difference between what we know and what we don’t know has increased, so has the ability of communication mechanisms to be able to discredit what we actually know and create confusion, and really much faster than our ability to communicate scientific evidence.

In many areas, I know myself, and highly intelligent people who are experts in their domain, getting caught up and misled by things which are communication tools rather than scientific approaches, to the point that we cannot tell the difference even in areas and domains where we are experts.

And that, for the general public, is terrible, because they’re not experts at anything, in a sense.

[00:18:14] Lucie: So is it a problem with research, or is it a problem with, I guess, the media, or how politicians and other people, not researchers, use evidence?

[00:18:26] David: Both. There are problems at every level of this. There’s a problem, more fundamentally, about what we can know and what we can’t know. This is something which is really rather difficult. Economics is a very easy scapegoat for this, but I’m going to use it.

[00:18:43] Lucie: I’m sure economists will be happy about that. They’re not here to defend themselves.

[00:18:48] David: Many of the economic theories are based on simplifications of a world which is much more complex than they’re dealing with. And that’s the nature of what they’re trying to do. What this then leads to is people making complex decisions on the basis of these theories, where the outcome is always going to be imperfect. And it’s not that there aren’t economists developing alternative theories.

The first female Nobel Prize winner in economics was Elinor Ostrom, and she did incredible work which basically demonstrated that it isn’t about a choice between big government and the free market. There are other ways to manage resources. So it’s a more complex problem, and that’s a very oversimplified account of what she found out, developed and showed.

But isn’t it interesting that all our political systems are still two-party systems, well not all, but very few recognize that this isn’t about left and right. And yet she got a Nobel Prize in Economics years and years ago for demonstrating that it is not a single-line, one-dimensional problem with two ends to it. There’s at least a third option, which means that we’ve got at least a two-dimensional sheet we should be thinking and working in, maybe more. And there have been all sorts of other theories about this in different ways. But it’s clear that it’s not just a one-dimensional problem.

And yet, all our political systems are still based on the economic theories which basically make it a left versus right, big government versus free markets, in different ways. And that’s at the heart of most political spectrums that we look at.

[00:20:33] Lucie: Yeah.

[00:20:33] David: And yet we know, as scientists, as intellectuals who understand the theory, that it’s not as simple as that, and that we shouldn’t be thinking in those simple two-choice options. And yet our structures are still built on that. And a lot of the economic theory which is coming out is really about elements along that simplistic line, and a lot of it is pushed towards silver-bullet-type solutions.

And that, to me, is the key. I want to take the second female Nobel Prize winner in economics, Esther Duflo, recognised for her work bringing randomised controlled trials to international development. And as positive as some elements of this have been, and not through any fault of her own but through the impact this has had on international development agencies and all sorts of others, I’m sorry to say that from my perspective this has done more harm than good. Because it’s narrowed the research methods used to study development. It means that everybody has tended to fit into a very narrow view of how we learn and how we gain knowledge. I don’t believe that was ever what was intended.

Yes, there is great value which can come, in specific contexts, from using randomised controlled trials in international development, and she did some fantastic work in this area. But there’s a whole range of powerful research methods that can and should be used, and it should be one of the tools in the toolbox.

And yet, not through a fault of her own, but through how this has been taken up, it’s made all studies very similar, when actually, often the studies need to be different because they’re trying to find out different things, and the context is different, and what’s known is different.

In medical research, the randomised controlled trial plays a very specific role in a long chain of research processes. It’s needed at a very specific point to be able to establish certain things, and that’s where this has come from as a methodology, as she adopted it. But what’s happened in international development is that all the other stages have been lost.

And all the other types of research have been lost, whereas all of them are valuable. It’s that combination, that balance between them, which really gives value. And we’ve had colleagues we work with who have found themselves in the middle of what was supposed to be a randomised controlled trial, where the results are really great in many different ways, but because there’s been contamination, because of the positive elements of the results, they’re worried that the results may not have value.

But of course they have good scientific value. Of course there’s wonderful value which is coming out. But they may not have societal value, because as a society we’re not valuing the range of approaches and evidence as we should. And that to me is a problem. So I think there’s this element about evidence: what we can know, what we can’t know, what we should be getting evidence for, what we shouldn’t, how we should be using that evidence. All of these are problematic in their current form.

Don’t get me wrong, there has been immense progress, but I would argue that we are very immature as a global society in how we treat and how we use the evidence that we are capable of generating. And I think, I hope, over the next 20 to 50 years, I will live to see us develop societies which use evidence better and which recognize evidence and communicate it better.

[00:24:41] Lucie: Yeah, I would love to go into some examples at some point about this. It feels like a really big topic.

[00:24:49] David: You’re thinking particularly in the use of evidence?

[00:24:53] Lucie: Well, I mean, you said treating evidence too, recognising evidence, you know, what do these things mean?

[00:25:01] David: Let me take a very interesting example for this. There’s some work which was done a number of years ago by Richard Wilkinson, who wrote a book called The Spirit Level and did a very nice TED talk on it. Broadly, his work over many years looked at investigating, in high-resource environments, the impact of wealth on indicators of social well-being. And broadly what he found is that there was very little correlation between wealth in high-resource environments, OECD countries, and indicators of social well-being in general. He had a whole different way of looking at this. But where he did find correlation was between social inequality and indicators of social well-being.

[00:25:56] Lucie: Okay, and I see he co-authored the book with Kate Pickett.

[00:26:01] David: Yes, that’s right, with Kate Pickett, sorry, yes. They presented this, and I must admit I find the book quite a dry read, so it’s not the most engaging book to read through; it goes through the same point again and again in different ways. I think the TED talk basically gives you the key points in a concise way. His work has been critiqued by others, but I think by and large it’s stood the test of time, in the sense that other people in other contexts have found that, again, indicators of social inequality do tend to be correlated in interesting ways with indicators of social well-being.

[00:26:41] Lucie: Yeah.

[00:26:41] David: What it broadly means is that the more unequal your society, if it’s a wealthy society, the worse your society seems to perform with respect to many of these indicators of social well-being. This has been out there for a long time, and yet I’ve very rarely seen people taking this evidence and presenting it in a format which would lead to, let’s say, policy. If you think about what this might imply for governmental policy, it might imply that maybe governments should be competing less on growing GDP, for example, which is the standard indicator of growth.

[00:27:26] Lucie: Absolutely. And there’s so many other different indicators that could be used to measure how well a country is doing.

[00:27:32] David: And recognising that growth at the expense of greater social inequality might actually be detrimental to the country. This is exactly what’s happening in many developed economies at the moment: in the UK our indicators of social inequality are going up and there’s a whole narrative around driving growth. And yet it’s known, in some ways, that a lot of society would be better off if they supported policies, and not just policies but ideologies, I would argue in this particular case, which enabled us to build from what we know. And what we know is that if there isn’t enough to go around, it’s very different.

But if you’re in a high-wealth environment, it’s another matter. And there are others who have thought about this in different ways: there’s, you know, doughnut economics, there’s all sorts of other interesting work which has come out related to this, but it’s not quite the same. I think one of the things which is so important, which comes back to their work in The Spirit Level: if we think of this in terms of what it might mean for society, it does not mean we want societies where there is no social inequality. It means that we don’t want extreme social inequality. That’s what the evidence points to.

[00:29:06] Lucie: Yeah.

[00:29:07] David: There is absolutely no point in trying to get a perfectly equal society, but there is a lot of value in reducing the extreme inequalities that can emerge in certain societies. And I think that’s the sort of thing where you have billionaires in the US asking to pay more tax; there’s a whole set of signatories who asked for this. Not all billionaires would be in favour of this, of course. But why governments can’t respond to that demand, why you can’t get those sorts of sensible policies, where people who have extreme wealth want to contribute to society in a way which is different and better and more positive, why that cannot be embraced by society, is something which I struggle to understand. The evidence is behind it.

Actually, many of the people who would be negatively affected by it are behind it. And yet as a society we somehow can’t embrace this. This is not just true in the US, it’s also true in the UK, and it’s something where our political systems seem to be built around self-interest in ways which I don’t understand, but which mean that the evidence is not able to be listened to.

[00:30:36] Lucie: We’ve come back to how we don’t really recognise the value of evidence.

[00:30:42] David: There’s evidence, in different ways, of what’s needed and of what we could do, and yet we don’t follow the evidence. Not just as a society, but also as individuals. That’s not how we work as individuals and it’s not how we work as societies. And I’m not saying that everybody should be following the evidence all the time, on everything, on how to become as healthy as possible. No, that’s not the point at all. It’s how you recognise and value evidence that is to me critical. And part of what recognising evidence relates to, and there’s a whole group of people who recognise this and bring it out, is that part of the point is uncertainty.

Actually, one of the reasons that we struggle to embrace evidence is because, as humans, we really struggle to deal with uncertainty. And the whole point is that most evidence we have talks about what might happen on average.

[00:31:51] Lucie: Yeah.

[00:31:52] David: But that’s very different to what happens to an individual.

[00:31:55] Lucie: Yeah.

[00:31:57] David: What happens on average and what happens to an individual are two totally different things.

[00:32:06] Lucie: I feel like you’re sort of putting into question the whole point of evidence in itself. I mean, you’re not at all, but if you keep that in mind when you look at most stats or most research results which come out, then… I guess research results are generally models, and models are always incorrect.

[00:32:25] David: Well, it’s not just that, it’s the question of what evidence actually is. Let me give a context the other way around. A long time ago, when people were building airplane seats, they built the ideal seat for the average pilot. Which meant that every single pilot was uncomfortable, because no pilot was average in all dimensions.

This took a long time for people to realize and recognise, but it’s so blindingly obvious. No pilot is average in every dimension, so a seat which only fits the average pilot was terrible for everyone. Anyway, the point is that evidence is not there to tell us what happens on average.
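The "average pilot" point can be illustrated with a small simulation, a sketch under the assumption that each measurement is independent and bell-shaped: the fraction of people close to average on every measurement collapses as the number of measurements grows. The tolerance, sample size and dimension counts below are arbitrary choices for illustration, not figures from the episode.

```python
# A quick simulation of the "average pilot" point: even when each measurement
# is bell-shaped, almost nobody is close to average on every measurement at
# once. The tolerance, dimension counts and sample size are arbitrary choices
# for illustration.
import random

random.seed(0)

def fraction_near_average(n_dims, n_people=100_000, tolerance=0.3):
    """Fraction of simulated people within `tolerance` standard deviations of
    the mean on all `n_dims` independent measurements."""
    near = 0
    for _ in range(n_people):
        if all(abs(random.gauss(0, 1)) <= tolerance for _ in range(n_dims)):
            near += 1
    return near / n_people

for dims in (1, 3, 10):
    print(f"{dims:>2} dimensions: {fraction_near_average(dims):.4f} near average on all")
```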

[00:33:14] Lucie: No.

[00:33:15] David: Good evidence is different to that. Knowing what happens on average is not evidence. This is where so much research is actually not good research, because people misunderstand what they’re actually observing. Whatever it may be that you’re looking at, something you care about, are you wanting to know what happens on average, or are you wanting to know what happens in the extreme cases? That’s where good evidence depends on knowing how to look at this.

[00:33:43] Lucie: Yeah.

[00:33:43] David: This is the thing: good evidence is hard to come by. And there are a lot of things where people want answers, but we don’t necessarily have the ability to provide evidence to give those answers, because people are looking for certainty in contexts where you can’t give certainty. I want the evidence of this so that I can know what I should do.

No, it doesn’t work like that. What you as an individual should do is different: you’re an individual. Take the lottery: on average, if you play the lottery, you will lose. Does that mean you shouldn’t play the lottery? No. You might win. It’s very unlikely that you’ll win, it’s more likely that you won’t win than that you will, and on average you will lose, because that’s why the lottery makes money. But while you have that ticket and you don’t know, you have hope. You have things which are actually valuable, irrespective of whether you win or not. You can dream of what would happen if you did win.
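A back-of-the-envelope expected-value calculation makes the "on average you will lose" point explicit. The ticket price, jackpot and odds below are invented, roughly lottery-like numbers, not figures from the episode.

```python
# A back-of-the-envelope expected-value calculation for the lottery point.
# Ticket price, jackpot and odds are invented, roughly lottery-like numbers.
ticket_price = 2.00            # cost of one ticket
jackpot = 10_000_000.00        # prize if this ticket wins
p_win = 1 / 45_000_000         # chance that this ticket wins

expected_return = p_win * jackpot           # average amount won per ticket
expected_loss = ticket_price - expected_return

print(f"Expected return per ticket: {expected_return:.2f}")
print(f"Expected loss per ticket:   {expected_loss:.2f}")
# On average each ticket loses most of its price, which is exactly why the
# lottery makes money, even though any individual ticket might still win.
```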

Of course, the irony is, I think most studies of lottery winners find that they’re actually not happier for having won. And that’s the real irony. If you think about this, it’s mind-blowing, but it’s fantastic. The value of the lottery is not necessarily for the winners; they’re the unlucky ones. If you win the lottery, then the evidence shows you tend to become unhappier, your marriage tends to split up, and all sorts of other negative long-term outcomes tend to come to lottery winners.

[00:35:24] Lucie: It’s the ability to dream where it finds its value then.

[00:35:28] David: Exactly. And I haven’t been involved in these studies, but I have read up on some of them, so I don’t know this perfectly, and maybe I’m getting some of it wrong. But it is the sort of thing which is important. There’s been some other fantastic research which showed that everybody wants choice.

[00:35:45] Lucie: Yeah, no, no, no, I can’t handle choice.

[00:35:48] David: But actually the evidence shows you’re right. Actually, people are less happy for having had choice.

[00:35:54] Lucie: Yeah.

[00:35:54] David: There was this study where people got some art. And the more…

[00:35:57] Lucie: It’s because you’re always worried, you’ve got pressure then to find the right one. And then you’re worried that afterwards you’ve chosen the wrong one.

[00:36:03] David: That you’ve chosen the wrong one, exactly. And this is where, actually, they found many weeks later in the study that the people who had had less choice were happier with what they received than the people who had had more choice but hadn’t got what they wanted.

The evidence is there, and these are sort of fantastic findings. Incentives for intellectual work: there were some wonderful studies which showed that for intellectual tasks, the higher the financial incentive, the worse people perform the task.

[00:36:35] Lucie: Ok, interesting.

[00:36:36] David: And yet these are the sort of incentives which always tend to come out, whether the task is intellectual or not. Now there are other studies, which happened before, which showed that for non-intellectual tasks, the higher the financial incentive, the better people performed. So it depends on the context.

And this is where we know so much and yet we know so little. This is what I feel is so important, and I’m so grateful for the opportunities I’ve had in education. Because what I learned when I became an expert in my narrow area wasn’t how much I knew, it was how much there was out there that I didn’t know.

Because I knew more than pretty much anyone else in the world about this one narrow area. And yet there were all these other things that I didn’t know, that I could know if only I put four or five years into working on them. Time is limited.

[00:37:45] Lucie: Absolutely. And that’s perhaps a good time to end too. It’s been an interesting discussion.

[00:37:51] David: Thank you. I guess we should try and come to some form of final statement on evidence which is slightly better than just, you know, the more you know, the more you know you don’t know. Good statement. I like that. I love that statement. But I do think, on evidence, the importance of valuing evidence as societies, as individuals, is there. But I don’t know how to build a society, or how to create societies, which don’t get misled, or don’t misuse evidence. We are, I believe, in an era where the appreciation of evidence and the value of experts are being eroded.

[00:38:38] Lucie: Yeah.

[00:38:39] David: And I think it’s important those trends change in certain ways. But so many experts have betrayed people’s trust because, although they have their academic expertise, they aren’t able to actually answer the questions that need answering. They don’t actually know what they need to know to be able to bring evidence to bear in the right way, so they end up bringing just their perspective.

Not an objective view of evidence. And there are, of course, many great scientists who don’t do that. They don’t tend to be the ones that people listen to. And this is another irony, of course.

[00:39:16] Lucie: Yeah.

[00:39:18] David: You know, people want answers. And yet the best scientists I know, they have questions, they don’t have answers.

[00:39:28] Lucie: Definitely.

[00:39:31] David: And that’s a really interesting societal challenge for us to deal with. How do we value that in the right way? Because we need answers.

How do we as society value evidence if we look for answers but the people who know the most know they don’t have the answers, they only have questions? I’m afraid on this particular occasion I don’t feel I’m leaving it with any deep insight for anyone, but I am leaving with an interesting question.

[00:40:05] Lucie: You are. And one which I obviously have no answer to.

[00:40:13] David: Oh. This has been a fun discussion. Thank you.

[00:40:15] Lucie: It has. Yeah. Thank you.