Transcript for:
Understanding Ethics in Research Practices

Hello everyone, and welcome to the third part of our series on culture and ethics. Today we're going to be talking about the ethics part of this. We spent some time talking about the intersection of culture, politics, and science, and we've learned that there are influences in both directions, both good and bad, that affect not only what science is performed, but what theories emerge, who the proponents of those theories are, who can do the science, and...

what the implications are when these two things intersect with each other. So today what we're going to focus on is how we treat our participants, who are the sources of data, particularly in psychology. And we're going to be speaking mainly about the fair treatment of those participants. Really, all this boils down to is that as a researcher, you don't want to do anything to other people that you wouldn't want done to yourself.

So whatever considerations you would like for yourself, pass those considerations on to the people who are working on or participating in your study. The first of these principles is the idea of giving people the ability to choose.

This was absent in a lot of those earlier studies, but voluntariness, the idea that you have to give consent to participate in a study, is important. It wasn't always built into our research, but it is now one of the core foundations. Contrast this with coercion, which is the idea that although you say something is voluntary, the perceived benefits or the actual compensation are so great that participants feel they can't pass it up. This was the case in the Tuskegee study, where the participants thought they were getting top-notch medical care.

They were getting meals. They were getting transport. And at a time when all of these things were scarce, that is something you couldn't turn down. The same goes for the level of compensation. If you give a certain amount of compensation for participating, in some circles it's not going to seem like a lot. In other circles, it could be the difference between eating and not eating.

So that same amount can have very different impacts. You have to be aware of that, account for it, and adjust your study or any compensation to make sure that people always have a genuine choice, that not doing the study does not cause them harm, and that doing the study does not cause them harm either.

We have to work that both ways. We also have to be careful about how much data we collect and how invasive we are with people. So how much data do we really need?

This is the big debate now with our mobile devices, because so much information is carried on these things, whether we realize it or not: at the very least our position, where we are, our likes, our dislikes. There are algorithms that track the kinds of things you're clicking on and then modulate your experience based on that. We're looking at this in terms of health care, in terms of behavioral patterns, and asking whether we can do that. There are positive things that can emerge from this, but there's also a lot of invasiveness that comes with it.

Do we really want people to have access to us all the time? So this is the issue of getting just the data we need for our study and nothing more. This was one of the big issues in medical research, where researchers were often getting more from people than what was actually necessary. The next thing is deceit, which, if you remember the cartoon we started with, was one of the big issues: everyone thinks we're trying to trick them. And there are times when we do have to deceive.

Sometimes we can't tell you what the actual purpose of the study is or why you're doing a particular task. But after the study, we should make every effort to do so.

This is, again, one of the downfalls of the Milgram study, where they didn't do that. And in the Little Albert study, there was no way to do that, because the participant was an infant; there's no way to explain it to an infant.

Some of the research looking at race is highly emotionally charged, and we have to debrief to make sure people leave feeling okay. That leads to our next topic, the inducement of uncomfortable states. Being ethical does not mean that we're never going to stress people out or that we can't cause any harm. It means that anything we do that makes people uncomfortable has to be balanced out and exceeded by the potential or real benefits that come from it. In my case, my work is in motion sickness.

What we would do is expose people to visual stimuli that mimic motion, record their motion, and see what would happen as they went through. Now, often what happened is people would get ill.

And what we had to build into our study is the recognition that we didn't need people to get deathly ill. We don't need you throwing up. We don't need you passing out, because that's all known. That's all documented.

What we were interested in is what happens up until that point. So we had to make it very clear in our research that when you start feeling off, it's time to stop. And again, you have the choice to stop at any time, for any reason or no reason. That's the key thing I want to emphasize. This is doubly important when we do animal research, because animals can't give consent and we can't debrief animals. So the rules and regulations for working with animals are much more stringent than they are for working with people, because animals don't have that autonomy. The next question is: now that we have data, how do we handle it?

So a couple of things we need to be aware of are plagiarism and the falsification of data. Plagiarism is taking somebody else's ideas as your own. Falsifying data is just what it sounds like: creating information that didn't exist, or didn't exist in the form in which you received it. Both of these are deadly to science, because science is supposedly self-correcting; it is based on trust that you as a researcher are doing the things you're supposed to do. Again, this goes back to those seven assumptions we can test that we covered earlier. These things really can mess with that.

And so we take them seriously. Any of these are grounds for retraction of your work at the least, or dismissal from your position at the worst. So these are serious things to consider: you want to make sure you're giving credit where credit is due.

The intellectual capital is what we live on. It's what we have. The data has to be accurate because other people are going to build off that. And if your data is false, then what they're building on will also be false.

So it has a cascade effect that we have to be aware of. So what should we do? As researchers, we should know what our obligations are. We should respect those obligations.

So not just knowing them and then tossing them over our shoulder, but actually adhering to them. Doing that will allow us to carry out research in an ethical manner. Now, what does that mean specifically?

That's what we're going to talk about on the next slide. The next slide covers the big three principles from the Belmont Report. This is a report that was generated in the late 70s to guide how we should interact with our participants, and it revolves around three general categories: respect, beneficence, and justice.

These form the core set of principles that every research project involving people has to adhere to. Respect: we've talked a little bit about this. People have to be volunteers, or know they're volunteers, and they have to know what they're getting into. They don't have to know why you're doing it, but they have to know what's expected, what the likely time frame is, and what the benefits and risks are, so they can decide whether they want to participate based on that. Beneficence is the idea that we're trying to do good, that what we're trying to do is gain knowledge that will be helpful in some form.

Again, it's not that there can't be any risk; it's just that the benefits have to outweigh those risks, and the benefits can be societal as opposed to individual. Justice is our third one, and it basically means making sure you treat people as people. That means properly compensating people and not wasting their time. You want to make sure that the research you're doing and the data you're collecting are going to yield some valuable outcomes, and again, what counts as valuable is up for debate, but that there's some tangible benefit to what you've done.

That's why sometimes, even if a study doesn't have any risk, if it doesn't have any perceived benefit, it's likely to get rejected anyway, because at the very least you're wasting people's time if nothing is going to come out of it. So these are the key things we need to keep in mind whenever we're doing research.

Every university has an ethics board that's going to check your research to make sure you're adhering to these principles. These reviews happen regularly, at least yearly depending on your institution. And to receive government funding for research, you have to adhere to these principles as well. Alright, so those are the main three. There's some other information on the Canvas site if you want to get into this a little bit deeper, but these are the key things that are going to keep us above board when doing science.

So what I'm going to leave you with is the opposite of this: the things that look like science but really aren't science. This is the idea of pseudoscience. Pseudoscience uses the trappings, the language, or the format of science without really being science.

What it will try to do is associate itself with a known phenomenon or use particular phrasing to make it sound good. It tends to rely on anecdotal evidence. One way to spot this is that the evidence is testimonial, "I knew a person," or "I've seen this happen," but not anything that's data-based. When you see that, you know you might be dealing with pseudoscience. The other thing is that these testimonials often involve effort justification. If it was hard, or you went through something demanding, you're more likely to say it had an effect even if it didn't, because you put in the effort or you made a public declaration. Any of those things will make you say, yeah, it really did have an effect, even if it didn't.

There were some famous studies by Festinger that highlight this effect, called cognitive dissonance. Often what you see with pseudoscience is that there's no way to falsify it: no matter what data you come up with, they have an explanation for how that data actually fits their theory. So things that can't be falsified tend to be pseudoscience as well. The other thing you tend to see is that it relies on confirmation bias, that it looks only for information that supports it and not for anything that...

could discredit it. So the reporting tends to be biased. We also talked about science wanting to be parsimonious; pseudoscience tends to be overly parsimonious. It will either give vague explanations or reduce everything to a soundbite. But parsimony does not mean simple or short; it means straightforward. Pseudoscience takes it literally and goes for something like, "it works because it just melts the fat away."

And that's their explanation for it. So these are the kinds of things that will tip you off that you're not really looking at actual science. A couple of examples.

One is phrenology. This was the belief that mental attributes could be discovered and quantified by measuring the surface of the skull. It used the rigor and language of science. It even had a scientific-looking

kit with very precise measures, but it was kind of like astrology in that it really didn't have any real basis. It remained popular, particularly as a parlor trick, because you could buy this kit, measure things out, and go, oh, you must be all of this. And part of why it seemed to work is because you know the people you're with.

And they go, oh yeah, that's right, I do have that trait. But it wasn't a real thing.

Another case of this, borrowed from actual psychology, is subliminal advertising, the idea that you can put things into media messages that trigger behaviors without the person knowing it. This emerged from a set of studies called priming studies in social and cognitive psychology, where you could show that you could influence people's responses by presenting stimuli very briefly.

And they took this to the extreme and said that you can influence people to perform very complex behaviors by doing the same thing. There is no evidence for this, although people have seemingly continued to do it.

There's a researcher who was doing this work, Dr. Wilson Key, who wrote several books, one of which is called Subliminal Seduction; you'll find it in the library. He looked at advertising as a place where they're always trying to do this. So, for example...

This picture here, for gin, is supposed to, and this is very Freudian, trigger your sex drive and therefore make you want to buy the product. There are several things in this picture that are subliminal and supposedly will trigger that response. One of the obvious ones, if you will, is that the ice cubes spell the word "sex," and I'll let you look for that.

There's another one here, and there are five more things in this picture, so if you want to take the time to look through it, you can. But the point is that it takes a very simple mechanism and makes it seem like you can create a complex set of behaviors and events from it.

So, again, that just gives you an idea about how this can go wrong and what we should do to make things right. So hopefully you found this useful. We will see you next round.

Take care, everyone.