SVM Tie Breaking Example
Tie breaking is costly if decision_function_shape='ovr', and therefore it is not enabled by default. This example illustrates the effect of the break_ties parameter for a multiclass classification problem with decision_function_shape='ovr'.

The two plots differ only in the middle region where the classes are tied. If break_ties=False, all inputs in that region are classified as the same class, whereas if break_ties=True, the tie-breaking mechanism creates a non-convex decision boundary in that region.
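As a rough sketch of what the tie-breaking mechanism does (this snippet is not part of the original example; the variable names and the reuse of the same blob data are only illustrative assumptions): with decision_function_shape='ovr' and break_ties=True, predict resolves ties using the per-class confidence values returned by decision_function, so the predicted label agrees with the highest-scoring class.

import numpy as np

from sklearn.datasets import make_blobs
from sklearn.svm import SVC

# Same three-blob data as in the example below
X_demo, y_demo = make_blobs(random_state=27)

svm_demo = SVC(
    kernel="linear", C=1, break_ties=True, decision_function_shape="ovr"
).fit(X_demo, y_demo)

# With break_ties=True, the prediction follows the per-class confidence
# scores from decision_function (shape: n_samples x n_classes)
scores = svm_demo.decision_function(X_demo)
assert np.array_equal(
    svm_demo.predict(X_demo), svm_demo.classes_[np.argmax(scores, axis=1)]
)

The complete example below visualizes the effect of both settings on the decision regions.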

# Authors: The scikit-learn developers
# SPDX-License-Identifier: BSD-3-Clause

import matplotlib.pyplot as plt
import numpy as np

from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(random_state=27)

fig, sub = plt.subplots(2, 1, figsize=(5, 8))
titles = ("break_ties = False", "break_ties = True")

for break_ties, title, ax in zip((False, True), titles, sub.flatten()):
    # Fit a linear multiclass SVC with a one-vs-rest decision function
    svm = SVC(
        kernel="linear", C=1, break_ties=break_ties, decision_function_shape="ovr"
    ).fit(X, y)

    xlim = [X[:, 0].min(), X[:, 0].max()]
    ylim = [X[:, 1].min(), X[:, 1].max()]

    # Predict on a dense grid covering the data range
    xs = np.linspace(xlim[0], xlim[1], 1000)
    ys = np.linspace(ylim[0], ylim[1], 1000)
    xx, yy = np.meshgrid(xs, ys)
    pred = svm.predict(np.c_[xx.ravel(), yy.ravel()])

    colors = [plt.cm.Accent(i) for i in [0, 4, 7]]
    points = ax.scatter(X[:, 0], X[:, 1], c=y, cmap="Accent")
    classes = [(0, 1), (0, 2), (1, 2)]
    line = np.linspace(X[:, 1].min() - 5, X[:, 1].max() + 5)

    # Shade the predicted regions
    ax.imshow(
        -pred.reshape(xx.shape),
        cmap="Accent",
        alpha=0.2,
        extent=(xlim[0], xlim[1], ylim[1], ylim[0]),
    )

    # Draw the pairwise (one-vs-one) separating lines
    for coef, intercept, col in zip(svm.coef_, svm.intercept_, classes):
        line2 = -(line * coef[1] + intercept) / coef[0]
        ax.plot(line2, line, "-", c=colors[col[0]])
        ax.plot(line2, line, "--", c=colors[col[1]])

    ax.set_xlim(xlim)
    ax.set_ylim(ylim)
    ax.set_title(title)
    ax.set_aspect("equal")

plt.show()
Total running time of the script: (0 minutes 1.020 seconds)
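As noted above, tie breaking is costly with decision_function_shape='ovr' because predict has to go through the confidence scores of decision_function. A quick, machine-dependent way to observe this (again not part of the original example; the batch size is an arbitrary assumption, and the gap may be small on a toy dataset like this one) is to time predict on a large batch of query points:

import time

import numpy as np

from sklearn.datasets import make_blobs
from sklearn.svm import SVC

X, y = make_blobs(random_state=27)

# A large batch of random query points, only to make the overhead visible
rng = np.random.RandomState(0)
queries = rng.uniform(X.min(), X.max(), size=(200_000, 2))

for break_ties in (False, True):
    clf = SVC(
        kernel="linear", C=1, break_ties=break_ties, decision_function_shape="ovr"
    ).fit(X, y)
    start = time.perf_counter()
    clf.predict(queries)
    print(f"break_ties={break_ties}: {time.perf_counter() - start:.3f} s")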