Type: Package
Title: Better Statistics for OLS and Binomial Logistic Regression
Version: 0.2.0
Author: Chris Aberson
Maintainer: Chris Aberson <cla18@humboldt.edu>
Description: Provides squared semi-partial correlations, tolerance, Mahalanobis distance, likelihood ratio chi-square, and pseudo R-square for OLS and binomial logistic regression models. Aberson, C. L. (2022) <doi:10.31234/osf.io/s2yqn>.
License: GNU General Public License version 3
Encoding: UTF-8
LazyData: true
Imports: car (≥ 3.0-0), stats (≥ 3.5.0), dplyr (≥ 0.8.0)
Depends: R (≥ 3.5.0)
RoxygenNote: 7.1.2
NeedsCompilation: no
Packaged: 2022-03-29 23:41:52 UTC; cla18
Repository: CRAN
Date/Publication: 2022-03-30 17:00:01 UTC
Compute Likelihood Ratio Chi-square for Binomial Logistic Regression with up to 10 predictors
Description
Compute Likelihood Ratio Chi-square for Binomial Logistic Regression with up to 10 predictors
Usage
LRchi(
data = NULL,
y = NULL,
x1 = NULL,
x2 = NULL,
x3 = NULL,
x4 = NULL,
x5 = NULL,
x6 = NULL,
x7 = NULL,
x8 = NULL,
x9 = NULL,
x10 = NULL,
numpred = NULL
)
Arguments
data: name of your data file (must already be loaded)
y: dependent variable name
x1: first predictor variable name
x2: second predictor variable name
x3: third predictor variable name
x4: fourth predictor variable name
x5: fifth predictor variable name
x6: sixth predictor variable name
x7: seventh predictor variable name
x8: eighth predictor variable name
x9: ninth predictor variable name
x10: tenth predictor variable name
numpred: number of predictors
Examples
LRchi(data=testlog, y="dv", x1="iv1", x2="iv2", numpred=2)
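For reference, the likelihood ratio chi-square for a binomial logistic regression is the difference between the null deviance and the residual deviance of the fitted model. A minimal base-R sketch of that quantity, assuming the package's testlog data are loaded (LRchi itself may report additional output):
full <- glm(dv ~ iv1 + iv2, data = testlog, family = binomial())
null <- glm(dv ~ 1, data = testlog, family = binomial())
lr_chi <- null$deviance - full$deviance        # likelihood ratio chi-square
df     <- full$df.null - full$df.residual      # degrees of freedom = number of predictors
p_val  <- pchisq(lr_chi, df = df, lower.tail = FALSE)
c(chisq = lr_chi, df = df, p = p_val)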
Compute Mahalanobis Distance for Multiple Regression
Description
Compute Mahalanobis Distance for Multiple Regression
Usage
Mahal(model = NULL, pred = NULL, values = 5)
Arguments
model: name of the fitted model
pred: number of predictors
values: number of Mahalanobis distance values to print (highest values); defaults to 5, per the usage above
Value
Mahalanobis distances for detecting multivariate outliers
Examples
mymodel<-lm(y~x1+x2+x3+x4, data=testreg)
Mahal(model=mymodel, pred=4, values=10)
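The distances reported here can also be computed with base R's mahalanobis() on the model's predictor matrix. A rough sketch, assuming testreg is loaded; the object names X and md are for illustration only:
mymodel <- lm(y ~ x1 + x2 + x3 + x4, data = testreg)
X  <- model.matrix(mymodel)[, -1]                        # predictor columns, intercept dropped
md <- mahalanobis(X, center = colMeans(X), cov = cov(X)) # squared Mahalanobis distances
sort(md, decreasing = TRUE)[1:10]                        # ten largest values, as with values = 10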
R-square change for Hierarchical Multiple Regression
Description
R-square change for Hierarchical Multiple Regression
Usage
R2change(model1 = NULL, model2 = NULL)
Arguments
model1: first regression model
model2: second regression model
Examples
mymodel1<-lm(y~x1+x2, data=testreg)
mymodel2<-lm(y~x1+x2+x3+x4, data=testreg)
R2change(model1=mymodel1, model2=mymodel2)
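The change statistic is the difference in R-squared between the two nested models, and the accompanying F test is available from anova(). A minimal base-R sketch, assuming testreg is loaded:
mymodel1 <- lm(y ~ x1 + x2, data = testreg)
mymodel2 <- lm(y ~ x1 + x2 + x3 + x4, data = testreg)
summary(mymodel2)$r.squared - summary(mymodel1)$r.squared  # R-squared change
anova(mymodel1, mymodel2)                                   # F test for the change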
Comparing Dependent Coefficients in Multiple Regression with up to Five Predictors
Description
Compares dependent regression coefficients (coefficients estimated from the same sample and model) in a multiple regression with up to five predictors, using either absolute values or raw coefficients.
Usage
depbcomp(
data = NULL,
y = NULL,
x1 = NULL,
x2 = NULL,
x3 = NULL,
x4 = NULL,
x5 = NULL,
numpred = NULL,
comps = "abs"
)
Arguments
data: name of data file
y: dependent variable name
x1: first predictor variable name
x2: second predictor variable name
x3: third predictor variable name
x4: fourth predictor variable name
x5: fifth predictor variable name
numpred: number of predictors
comps: type of comparison, "abs" for absolute values or "raw" for raw coefficients
Value
Comparing Dependent Coefficients in Multiple Regression
Examples
depbcomp(data=testreg, y=y, x1=x1, x2=x2, x3=x3, x4=x4, x5=x5, numpred=5, comps="abs")
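For two raw coefficients from the same model, a standard dependent-coefficient comparison divides their difference by a standard error that accounts for their covariance. The base-R sketch below illustrates that idea; it assumes testreg is loaded and is not necessarily the exact computation used by depbcomp (which also supports absolute-value comparisons):
m <- lm(y ~ x1 + x2, data = testreg)
b <- coef(m)
V <- vcov(m)
diff_b  <- b["x1"] - b["x2"]
se_diff <- sqrt(V["x1", "x1"] + V["x2", "x2"] - 2 * V["x1", "x2"])
t_stat  <- diff_b / se_diff
p_val   <- 2 * pt(abs(t_stat), df = m$df.residual, lower.tail = FALSE)
c(t = t_stat, p = p_val)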
Comparing Independent Coefficients in Multiple Regression
Description
Comparing Independent Coefficients in Multiple Regression
Usage
indbcomp(model1 = NULL, model2 = NULL, comps = "abs")
Arguments
model1: summary of the first model (see example for how to summarize)
model2: summary of the second model (see example for how to summarize)
comps: type of comparison; "abs" compares absolute values of the coefficients (recommended), "raw" compares raw coefficients
Value
Comparing Independent Coefficients in Multiple Regression
Examples
y_1 <- rnorm(200); x1_1 <- rnorm(200); x2_1 <- rnorm(200)
y_2 <- rnorm(200); x1_2 <- rnorm(200); x2_2 <- rnorm(200)
df1 <- as.data.frame(cbind(y_1, x1_1, x2_1))
df2 <- as.data.frame(cbind(y_2, x1_2, x2_2))
model1_2<-summary(lm(y_1~x1_1+x2_1, data=df1))
model2_2<-summary(lm(y_2~x1_2+x2_2, data=df2))
indbcomp(model1 = model1_2, model2 = model2_2, comps="abs")
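For coefficients estimated on independent samples, a common approximation divides the difference between the two estimates by the square root of the sum of their squared standard errors. A sketch using the model summaries created above; this is an illustration, not necessarily the exact formula used internally:
b1  <- coef(model1_2)["x1_1", "Estimate"]
se1 <- coef(model1_2)["x1_1", "Std. Error"]
b2  <- coef(model2_2)["x1_2", "Estimate"]
se2 <- coef(model2_2)["x1_2", "Std. Error"]
z <- (b1 - b2) / sqrt(se1^2 + se2^2)    # approximate z test for b1 - b2
2 * pnorm(abs(z), lower.tail = FALSE)   # two-tailed p value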
Compute squared semi partial correlations for Multiple Regression
Description
Compute squared semi partial correlations for Multiple Regression
Usage
parts(model = NULL, pred = NULL)
Arguments
model: name of model
pred: number of predictors
Value
Squared semi-partial correlations for multiple regression with up to 10 predictors
Examples
mymodel<-lm(y~x1+x2+x3+x4+x5, data=testreg)
parts(model=mymodel, pred=5)
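Each squared semi-partial correlation equals the drop in R-squared when that predictor is removed from the full model. An illustrative base-R sketch of that definition, assuming testreg is loaded:
full    <- lm(y ~ x1 + x2 + x3 + x4 + x5, data = testreg)
r2_full <- summary(full)$r.squared
preds   <- c("x1", "x2", "x3", "x4", "x5")
sr2 <- sapply(preds, function(p) {
  reduced <- lm(reformulate(setdiff(preds, p), response = "y"), data = testreg)
  r2_full - summary(reduced)$r.squared   # R-squared lost by dropping predictor p
})
sr2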
Pseudo R-square Values for Binomial Logistic Regression
Description
Pseudo R-square Values for Binomial Logistic Regression
Usage
pseudo(model = NULL)
Arguments
model: name of model
Value
Pseudo R-square Values for Logistic Regression
Examples
mymodel<-glm(dv~iv1+iv2+iv3+iv4, testlog,family = binomial())
pseudo(model=mymodel)
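Widely used pseudo R-square measures can be recovered from the fitted model's deviances. A sketch of the McFadden, Cox and Snell, and Nagelkerke variants, assuming testlog is loaded; the exact set of values reported by pseudo() may differ:
mymodel  <- glm(dv ~ iv1 + iv2 + iv3 + iv4, data = testlog, family = binomial())
n        <- nobs(mymodel)
dev      <- mymodel$deviance
null_dev <- mymodel$null.deviance
mcfadden   <- 1 - dev / null_dev                     # McFadden
cox_snell  <- 1 - exp((dev - null_dev) / n)          # Cox and Snell
nagelkerke <- cox_snell / (1 - exp(-null_dev / n))   # Nagelkerke
c(McFadden = mcfadden, CoxSnell = cox_snell, Nagelkerke = nagelkerke)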
testlog
Description
A dataset to test logistic regression functions
Usage
testlog
Format
A data frame with 164 rows and 11 variables:
- dv: dependent variable
- iv1: 1st predictor
- iv2: 2nd predictor
- iv3: 3rd predictor
- iv4: 4th predictor
- iv5: 5th predictor
- iv6: 6th predictor
- iv7: 7th predictor
- iv8: 8th predictor
- iv9: 9th predictor
- iv10: 10th predictor
testreg
Description
A dataset to test regression functions
Usage
testreg
Format
A data frame with 1000 rows and 6 variables:
- y: dependent variable
- x1: 1st predictor
- x2: 2nd predictor
- x3: 3rd predictor
- x4: 4th predictor
- x5: 5th predictor
Compute tolerance for Multiple Regression
Description
Compute tolerance for Multiple Regression
Usage
tolerance(model = NULL)
Arguments
model: name of model
Value
Tolerance values for multiple regression predictors
Examples
mymodel<-lm(y~x1+x2+x3+x4+x5, data=testreg)
tolerance(model=mymodel)
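Tolerance is the reciprocal of the variance inflation factor, so the same values can be obtained from car::vif() (car is already listed in Imports). A minimal sketch, assuming testreg is loaded:
library(car)
mymodel <- lm(y ~ x1 + x2 + x3 + x4 + x5, data = testreg)
1 / vif(mymodel)   # tolerance = 1 / VIF for each predictor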