Linear models
fast_automl.linear_model.ConstrainedLinearRegression
class fast_automl.linear_model.ConstrainedLinearRegression(constraint=0, copy_X=True, n_jobs=None) [source]
Linear regression where the coefficients are constrained to sum to a given value.
Parameters:

constraint : scalar, default=0
    Sum of the regression coefficients.
copy_X : bool, default=True
    If True, X will be copied; else, it may be overwritten.
n_jobs : int, default=None
    The number of jobs to use for the computation. This will only provide a speedup for n_targets > 1 and sufficiently large problems. None means 1 unless in a joblib.parallel_backend context. -1 means using all processors.
Attributes:

coef_ : array-like of shape (n_features,) or (n_targets, n_features)
    Estimated coefficients for the linear regression, constrained to sum to the given constraint value.
Examples
from fast_automl.linear_model import ConstrainedLinearRegression
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split
X, y = load_boston(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=True)
reg = ConstrainedLinearRegression(constraint=.8).fit(X_train, y_train)
print(reg.score(X_test, y_test))
print(reg.coef_.sum())
Out:
0.6877629260102918
0.8
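One way a sum-to-c constraint can be enforced is by reparameterization: substitute the last coefficient as c minus the sum of the others, which turns the constrained problem into an ordinary least-squares fit on transformed features. The sketch below illustrates this idea with plain scikit-learn on synthetic data; it is not necessarily how fast_automl implements ConstrainedLinearRegression internally.

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 5))
true_coef = np.array([0.3, 0.1, 0.2, 0.1, 0.1])  # sums to 0.8
y = X @ true_coef + rng.normal(scale=0.1, size=200)

c = 0.8
# Substitute coef_p = c - sum(coef_1 .. coef_{p-1}):
#   y - c * x_p = sum_j coef_j * (x_j - x_p) + intercept
# so fit OLS on differenced features against the shifted target.
Z = X[:, :-1] - X[:, [-1]]
t = y - c * X[:, -1]
base = LinearRegression().fit(Z, t)
coef = np.append(base.coef_, c - base.coef_.sum())
print(coef.sum())  # equals c up to floating-point error
```

The substitution is exact: minimizing the squared error subject to the sum constraint is equivalent to the unconstrained fit on the transformed problem, so no iterative constrained solver is needed.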
Methods
fit(self, X, y, sample_weight=None) [source]
Parameters:

X : array-like of shape (n_samples, n_features)
    Training data.
y : array-like of shape (n_samples, n_targets)
    Target values.
sample_weight : array-like of shape (n_samples,), default=None
    Individual weights for each sample.

Returns:

self :
    The fitted estimator.
predict(self, X) [source]
Parameters:

X : array-like of shape (n_samples, n_features)
    Samples.

Returns:

C : array of shape (n_samples, n_targets)
    Predicted values.
fast_automl.linear_model.Ridge
class fast_automl.linear_model.Ridge(alpha=1.0, prior_weight=0, normalize_coef=False, fit_intercept=True, normalize=False, copy_X=True, max_iter=None, tol=0.001, solver='auto', random_state=None) [source]
Ridge regression with the option for custom prior weights on coefficients.
Parameters:

prior_weight : array-like of shape (n_features,)
    Prior weight means.

See [scikit-learn's ridge regression documentation](https://scikit-learn.org/stable/modules/generated/sklearn.linear_model.Ridge.html) for additional parameter details.
Attributes:

coef_ : ndarray of shape (n_features,) or (n_targets, n_features)
    Weight vector(s).
intercept_ : float or ndarray of shape (n_targets,)
    Independent term in decision function. Set to 0.0 if fit_intercept=False.
Examples
from fast_automl.linear_model import Ridge
from sklearn.datasets import load_boston
from sklearn.model_selection import train_test_split
X, y = load_boston(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, shuffle=True)
reg = Ridge().fit(X_train, y_train)
reg.score(X_test, y_test)
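Ridge regression with a prior on the coefficients penalizes deviations from the prior means rather than from zero. This reduces to standard ridge regression after a change of variables: fit ridge on the residual target y - X @ prior, then add the prior back. The sketch below illustrates that equivalence with plain scikit-learn on synthetic data; it is not necessarily how fast_automl's Ridge applies prior_weight internally.

```python
import numpy as np
from sklearn.linear_model import Ridge as SkRidge

rng = np.random.default_rng(1)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, 2.0, 3.0]) + rng.normal(scale=0.1, size=100)

prior = np.array([1.0, 2.0, 3.0])  # prior coefficient means
alpha = 1.0
# Minimize ||y - X b||^2 + alpha * ||b - prior||^2.
# With d = b - prior this is standard ridge on the residual target,
# since ||y - X b||^2 = ||(y - X @ prior) - X d||^2.
base = SkRidge(alpha=alpha, fit_intercept=True).fit(X, y - X @ prior)
coef = prior + base.coef_
```

As alpha grows, coef shrinks toward prior instead of toward zero, which is the point of supplying custom prior weights.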
Methods
fit(self, X, y, sample_weight=None) [source]
Parameters:

X : array-like of shape (n_samples, n_features)
    Training data.
y : array-like of shape (n_samples, n_targets)
    Target values.
sample_weight : array-like of shape (n_samples,), default=None
    Individual weights for each sample.

Returns:

self :
    The fitted estimator.
predict(self, X) [source]
Parameters:

X : array-like of shape (n_samples, n_features)
    Samples.

Returns:

C : array of shape (n_samples, n_targets)
    Predicted values.