This article is about 1,576 characters; estimated reading time: 5 minutes.
One-vs-Rest
```python
# Train a multiclass classifier
from sklearn.linear_model import LogisticRegression
from sklearn import datasets
from sklearn.preprocessing import StandardScaler

# Load the data
iris = datasets.load_iris()
features = iris.data
target = iris.target

# Standardize the features
scaler = StandardScaler()
features_standardized = scaler.fit_transform(features)

# multi_class="ovr" selects one-vs-rest logistic regression;
# the alternative is multinomial logistic regression (MLR)
logistic_regression = LogisticRegression(random_state=0, multi_class="ovr")
# logistic_regression_MNL = LogisticRegression(random_state=0, multi_class="multinomial")

# Train the model
model = logistic_regression.fit(features_standardized, target)
```

Discussion

On their own, logistic regressions are only binary classifiers, meaning they cannot handle target vectors with more than two classes. However, two clever extensions to logistic regression do just that. First, in one-vs-rest (OvR) logistic regression, a separate model is trained for each class to predict whether an observation belongs to that class or not (thus turning it into a binary classification problem). It assumes that each classification problem (e.g., class 0 or not) is independent.

Alternatively, in multinomial logistic regression (MLR) the logistic function we saw in Recipe 15.1 is replaced with a softmax function:

P(y_i = k \mid X) = \frac{e^{\beta_k x_i}}{\sum_{j=1}^{K} e^{\beta_j x_i}}

where P(y_i = k \mid X) is the probability that the ith observation's target value, y_i, is class k, and K is the total number of classes. One practical advantage of MLR is that the probabilities it predicts via the predict_proba method are better calibrated (i.e., more reliable). We can switch to MLR by setting multi_class="multinomial".
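The softmax function above can be sketched in a few lines of NumPy. This is a minimal illustration, not part of the recipe; the class scores below are hypothetical values standing in for the linear terms β_k·x_i:

```python
import numpy as np

def softmax(scores):
    """Softmax: P(y = k) = exp(score_k) / sum_j exp(score_j)."""
    # Subtracting the max improves numerical stability without changing the result
    exps = np.exp(scores - np.max(scores))
    return exps / exps.sum()

# Hypothetical per-class scores for a single observation with 3 classes
scores = np.array([2.0, 1.0, 0.1])
probs = softmax(scores)
print(probs)  # three probabilities that sum to 1; the largest score gets the largest probability
```

Unlike the per-class sigmoid outputs of one-vs-rest, the softmax probabilities sum to 1 by construction, which is why MLR's predict_proba values are directly interpretable as a distribution over classes.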
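To see the two strategies side by side, here is a hedged sketch that fits both variants on the iris data and compares their predict_proba outputs. It uses scikit-learn's OneVsRestClassifier wrapper for the OvR variant (rather than the multi_class argument, which is deprecated in recent scikit-learn versions) and relies on LogisticRegression defaulting to the multinomial formulation for multiclass targets:

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.multiclass import OneVsRestClassifier
from sklearn.preprocessing import StandardScaler

# Load and standardize the data as in the recipe
iris = load_iris()
X = StandardScaler().fit_transform(iris.data)
y = iris.target

# One-vs-rest: one binary logistic regression per class
ovr = OneVsRestClassifier(LogisticRegression(random_state=0)).fit(X, y)

# Multinomial logistic regression (the default multiclass behavior)
mlr = LogisticRegression(random_state=0).fit(X, y)

# Use the first standardized observation as a stand-in for new data
observation = X[:1]
print(ovr.predict_proba(observation))  # one probability per class
print(mlr.predict_proba(observation))
```

Both models return one probability per class for each observation; the MLR probabilities come straight from the softmax, while the OvR probabilities are normalized per-class sigmoid outputs.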
Reposted from: http://xprg.baihongyu.com/