Okay. In the last video, you saw me use some general trial and error to find a transformation that dealt with our violated assumptions in a satisfactory way. As I mentioned in that video, trial and error is a common part of finding a transformation that's appropriate for your data. In this video, though, we're going to briefly discuss some general rules of thumb that can help guide your choice of transformation.

Log transforming your data can be particularly useful when your residuals are right skewed. What that means is that if you took your residuals and plotted them in a histogram, and you saw a tail sticking out to the right, that's a right skewed distribution. In that case, a log transformation is a good candidate for dealing with whichever of your assumptions have been violated. Log transformation also helps when the variance increases with the mean, which is one of the things we saw in our previous video. I just want to note that the log of 0 does not exist. So if your data set includes values of 0, it's fairly common not to simply take the log, but to add one to all of your data points and then take the log of those new values. Your transformation is then not just the log, but the log of your original data plus 1. And we want to make sure we add one to all the data points in order to treat all of our data equally. Our last point here is that there are different types of log transformations we can use, like log base 10 or log base e, which is the natural log, and for our purposes they have equivalent effects, since they differ only by a constant factor.

Now we're going to briefly discuss the arcsine square root transformation. It can be useful when analyzing proportions and percentages. Proportions and percentages are naturally bounded by 0 and 1.
If you have a proportion, the smallest possible proportion is 0 and the largest possible proportion is 1. Normal distributions are not bounded; theoretically, they go on to infinity in both directions. Functionally, that doesn't actually happen with real biological data, but I'm highlighting this point to show that there are good reasons for proportions not to be normally distributed. This transformation involves, first of all, taking the square root of all of your data, and then taking the arcsine of those square-rooted data. And this transformation can be useful, as I said, for dealing with proportions and percentages. In a moment, though, I'm going to show you how this transformation can be controversial.

Next we have the square root transformation. The square root transformation is particularly helpful when we have count values that are close to 0, or when we're dealing with Poisson variables. We're not going to discuss what Poisson means here; that will be dealt with in later videos. And again, we don't have to take only the square root. We could add a value to all of our data points, like 0.5 or three-eighths, and then take the square root of all those sums. The important thing, though, is to treat all of our data points identically. Otherwise, we will start introducing bias into our analysis, which we do not want to do.

I'm just going to quickly show you a paper with a beautiful title. It's this paper here, published in Ecology in 2011, I believe, called "The arcsine is asinine: the analysis of proportions in ecology". The fact that it says "in ecology" doesn't really matter; it could just as well say "the analysis of proportions". What these authors argue is that there are far more meaningful and potentially more powerful ways of analyzing data that are proportions or percentages.
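Before moving on, the three transformations we've discussed can be sketched in a few lines of Python. This is just a minimal sketch: the data are made-up values for illustration, and using numpy is my assumption, since the lecture itself doesn't name any software.

```python
import numpy as np

# Hypothetical right-skewed counts that include zeros (made up for illustration).
counts = np.array([0, 1, 2, 3, 5, 8, 13, 40, 120])

# log(0) does not exist, so add 1 to EVERY data point before taking the log,
# treating all observations identically.
log_counts = np.log(counts + 1)
# Base-10 and natural logs differ only by a constant factor, so they
# reshape the distribution in the same way.
log10_counts = np.log10(counts + 1)

# Hypothetical proportions, bounded by 0 and 1 (made up for illustration).
p = np.array([0.0, 0.05, 0.2, 0.5, 0.8, 1.0])
# Arcsine square root: take the square root first, then the arcsine.
asr = np.arcsin(np.sqrt(p))

# Square root transformation for small counts: add the same constant
# (here 0.5) to every point, then take the square root of the sums.
sqrt_counts = np.sqrt(counts + 0.5)
```

As a side note, `np.log1p(counts)` computes log(1 + x) directly and gives the same result as the explicit `np.log(counts + 1)` above.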
So I highly recommend that you check this paper out. The methods they recommend, however, are more advanced than a straightforward general linear model, which is what we're teaching at the current time.

Okay. So those are a handful of transformations that you could try, and the circumstances in which they're often a good bet to be productive. The real question that faces us is: once you've tried a series of transformations, how do you know whether or not you've got the best transformation for your data? The answer is something that I modeled in the previous video, where first I tried log transforming the data, and then I tried square root transforming the data, and I chose the transformation that made me feel most satisfied that my data meet the assumptions of the test. So the answer to the question "have I got the best transformation?" is to check your residual plots and ask whether or not a particular transformation has helped your data meet the assumptions of the model. And if you have a variety of transformations to consider, choose the transformation that actually does a good job. If none of the transformations help, we're going to talk about what to do in that case in a later video coming up soon.

I want to note, or really emphasize, that the best transformation is not the one that gives you the quote, unquote "best" p-value, and by that I mean the p-value that makes you happiest or most excited. It's really important, which is why I put this in bold down at the bottom, that you check your assumptions before checking your results, like your effect sizes and your p-values. That's because if you look at your p-values or effect sizes before you've decided on a particular transformation, then the results that you see could bias your choice of transformation.
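To make "compare transformations by their residuals, not their p-values" concrete, here's a minimal sketch. The lognormal sample and the intercept-only "model" are my own illustrative assumptions, not data from the course; the point is only that the diagnostic we compare is a property of the residuals.

```python
import numpy as np

rng = np.random.default_rng(1)

def skewness(x):
    # Sample skewness: right-skewed residuals give a positive value,
    # roughly symmetric residuals give a value near zero.
    z = (x - x.mean()) / x.std()
    return (z ** 3).mean()

# Hypothetical right-skewed response (a lognormal sample), the kind of data
# where the variance increases with the mean.
y = rng.lognormal(mean=2.0, sigma=0.8, size=500)

# Residuals from an intercept-only model are just deviations from the mean.
resid_raw = y - y.mean()
resid_log = np.log(y) - np.log(y).mean()

# Judge the transformation by its residuals -- never by any p-value.
print(skewness(resid_raw))   # strongly positive: right skewed
print(skewness(resid_log))   # near zero: the log transform did its job
```

In practice you would look at histograms and residual-versus-fitted plots rather than a single number, but the logic is the same: the transformation that wins is the one whose residuals best satisfy the model's assumptions.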
Because if one transformation gives you a result that you think is more interesting than another, then you might be more likely to choose the transformation that leads to the more interesting result. That's not what we want to do; we want to remain objective. And so that's why it's important to decide on a transformation, by checking all of your assumptions, before ever looking at your results. There are no data analysis police who are going to come knocking on your door if you don't follow this rule, but it's just a really good habit to keep. And on that note, I hope this video has helped guide you in your choices of transformations. Thank you.