Model Introduction
This example uses regressors based on ordinary random forests (Random Forests) and boosted tree models, as well as extremely randomized trees (Extremely Randomized Trees). The difference from an ordinary random forest is that, whenever an extremely randomized forest builds a split node in a tree, it does not pick features arbitrarily; instead, it first draws a random subset of the features and then uses criteria such as information gain (Information Gain) and Gini impurity (Gini Impurity) to choose the best splitting feature for that node.
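For reference, here is a minimal sketch (not part of the original article; it uses synthetic data from make_regression with parameters chosen purely for illustration) showing that the two ensembles share the same estimator interface, including the max_features parameter that bounds the random feature subset examined at each split:

# Minimal sketch: RandomForestRegressor vs. ExtraTreesRegressor on synthetic data (illustrative only)
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor, ExtraTreesRegressor
from sklearn.model_selection import cross_val_score

X, y = make_regression(n_samples=500, n_features=10, noise=10.0, random_state=0)

# Both ensembles accept the same hyper-parameters; max_features limits the
# random feature subset considered at each split node.
rf = RandomForestRegressor(n_estimators=100, max_features="sqrt", random_state=0)
et = ExtraTreesRegressor(n_estimators=100, max_features="sqrt", random_state=0)

print("RandomForestRegressor mean CV R^2:", cross_val_score(rf, X, y, cv=5).mean())
print("ExtraTreesRegressor   mean CV R^2:", cross_val_score(et, X, y, cv=5).mean())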
Code
from sklearn.datasets import load_boston  # note: load_boston was removed in scikit-learn 1.2; run this with an earlier version
from sklearn.model_selection import train_test_split
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.metrics import mean_squared_error, mean_absolute_error
from sklearn.ensemble import RandomForestRegressor, ExtraTreesRegressor, GradientBoostingRegressor

boston = load_boston()
# print(boston.DESCR)
X = boston.data
y = boston.target

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=33)

# Inspect the spread of the regression target values
print("The max target value is ", np.max(boston.target))
print("The min target value is ", np.min(boston.target))
print("The average value is ", np.mean(boston.target))

# Initialize separate scalers for the features and for the target
ss_X = StandardScaler()
ss_y = StandardScaler()

# Standardize; StandardScaler expects 2-D input, so the targets are reshaped to column vectors
X_train = ss_X.fit_transform(X_train)
X_test = ss_X.transform(X_test)
y_train = np.array(y_train).reshape(-1, 1)
y_test = y_test.reshape(-1, 1)
y_train = ss_y.fit_transform(y_train)
y_test = ss_y.transform(y_test)

# Regression on the Boston housing data with three ensemble models
rfr = RandomForestRegressor()
rfr.fit(X_train, y_train.ravel())  # ravel() passes a 1-D target and avoids a DataConversionWarning
rfr_y_predict = rfr.predict(X_test)

etr = ExtraTreesRegressor()
etr.fit(X_train, y_train.ravel())
etr_y_predict = etr.predict(X_test)

gbr = GradientBoostingRegressor()
gbr.fit(X_train, y_train.ravel())
gbr_y_predict = gbr.predict(X_test)

# Evaluate predictive performance
# Random forest regressor on the test set; predictions are reshaped to 2-D before inverse_transform
print('R-squared value of RandomForestRegressor:', rfr.score(X_test, y_test))
print('The mean squared error of RandomForestRegressor:',
      mean_squared_error(ss_y.inverse_transform(rfr_y_predict.reshape(-1, 1)), ss_y.inverse_transform(y_test)))
print('The absolute error of RandomForestRegressor:',
      mean_absolute_error(ss_y.inverse_transform(rfr_y_predict.reshape(-1, 1)), ss_y.inverse_transform(y_test)))
# out[]:
# R-squared value of RandomForestRegressor: 0.802527175854187
# The mean squared error of RandomForestRegressor: 15.31229842519685
# The absolute error of RandomForestRegressor: 2.486692913385826

# Extremely randomized forest with default settings, evaluated on the test set
print('R-squared value of ExtraTreesRegressor:', etr.score(X_test, y_test))
print('The mean squared error of ExtraTreesRegressor:',
      mean_squared_error(ss_y.inverse_transform(etr_y_predict.reshape(-1, 1)), ss_y.inverse_transform(y_test)))
print('The absolute error of ExtraTreesRegressor:',
      mean_absolute_error(ss_y.inverse_transform(etr_y_predict.reshape(-1, 1)), ss_y.inverse_transform(y_test)))

# Use the trained extremely randomized forest to output each feature's contribution to the prediction target.
# Caution: np.sort(..., axis=0) sorts the importance column and the name column independently,
# so the printed pairs are no longer aligned with each other (see the aligned alternative after this listing).
print(np.sort(list(zip(etr.feature_importances_, boston.feature_names)), axis=0))
# zip() differs between Python 2 and Python 3: in Python 3.x it returns a lazy object to save memory,
# so an explicit list() conversion is needed to display it as a list.
# out[]:
# R-squared value of ExtraTreesRegressor: 0.8146427218146937
# The mean squared error of ExtraTreesRegressor: 14.372843307086619
# The absolute error of ExtraTreesRegressor: 2.4622834645669283
# [['0.0034630894158620153' 'AGE']
#  ['0.0045550801016886' 'B']
#  ['0.017170090920678872' 'CHAS']
#  ['0.019637840054376292' 'CRIM']
#  ['0.023181227692063468' 'DIS']
#  ['0.024132181629465985' 'INDUS']
#  ['0.02619849089968168' 'LSTAT']
#  ['0.03682330189027084' 'NOX']
#  ['0.05075057773410293' 'PTRATIO']
#  ['0.05197045680058604' 'RAD']
#  ['0.08253058832660275' 'RM']
#  ['0.26698764587049606' 'TAX']
#  ['0.3925994286641245' 'ZN']]

# Gradient boosting regression trees with default settings, evaluated on the test set
print('R-squared value of GradientBoostingRegressor:', gbr.score(X_test, y_test))
print('The mean squared error of GradientBoostingRegressor:',
      mean_squared_error(ss_y.inverse_transform(gbr_y_predict.reshape(-1, 1)), ss_y.inverse_transform(y_test)))
print('The absolute error of GradientBoostingRegressor:',
      mean_absolute_error(ss_y.inverse_transform(gbr_y_predict.reshape(-1, 1)), ss_y.inverse_transform(y_test)))
# out[]:
# R-squared value of GradientBoostingRegressor: 0.8471412103647435
# The mean squared error of GradientBoostingRegressor: 11.852868433588265
# The absolute error of GradientBoostingRegressor: 2.273672422147374
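As noted in the listing, np.sort with axis=0 sorts the importance values and the feature names separately, so the printed rows do not pair each importance with its own feature. A minimal aligned alternative, reusing the fitted etr model and boston.feature_names from the code above, could look like this:

# Pair each importance with its feature name, then sort the pairs together (ascending by importance)
importance_pairs = sorted(zip(etr.feature_importances_, boston.feature_names))
for importance, name in importance_pairs:
    print(f"{name:10s} {importance:.4f}")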
Characteristics Analysis
Many practitioners who develop and deploy commercial analytics systems in industry favor ensemble models, and they often treat the performance of these models as the baseline against which newly designed models are compared. Although ensemble models take more time to train, they usually deliver higher predictive performance and better stability.
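The training-time versus accuracy trade-off mentioned above can be sketched roughly as follows (again on synthetic make_regression data with default hyper-parameters, purely for illustration; actual numbers will vary by dataset and configuration):

# Rough sketch of the training-time vs. accuracy trade-off: single tree vs. gradient boosting ensemble
import time
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeRegressor

X, y = make_regression(n_samples=2000, n_features=20, noise=15.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

for model in (DecisionTreeRegressor(random_state=0), GradientBoostingRegressor(random_state=0)):
    start = time.perf_counter()
    model.fit(X_train, y_train)
    elapsed = time.perf_counter() - start
    print(f"{type(model).__name__:28s} train time: {elapsed:.3f}s  test R^2: {model.score(X_test, y_test):.3f}")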
Original source: https://blog.csdn.net/qq_38195197/article/details/81235236