Wine classification problem: the dataset contains three kinds of wine, each sample described by 13 features.

1. Title of Database: Wine recognition data
   Updated Sept 21, 1998 by C. Blake: Added attribute information

2. Sources:
   (a) Forina, M. et al, PARVUS - An Extendible Package for Data Exploration, Classification and Correlation. Institute of Pharmaceutical and Food Analysis and Technologies, Via Brigata Salerno, 16147 Genoa, Italy.
   (b) Stefan Aeberhard, email: stefan@coral.cs.jcu.edu.au
   (c) July 1991

3. Past Usage:
   (1) S. Aeberhard, D. Coomans and O. de Vel, "Comparison of Classifiers in High Dimensional Settings", Tech. Rep. no. 92-02 (1992), Dept. of Computer Science and Dept. of Mathematics and Statistics, James Cook University of North Queensland. (Also submitted to Technometrics.)
       The data was used with many others for comparing various classifiers. The classes are separable, though only RDA has achieved 100% correct classification (RDA: 100%, QDA: 99.4%, LDA: 98.9%, 1NN: 96.1% on z-transformed data; all results using the leave-one-out technique). In a classification context, this is a well-posed problem with "well behaved" class structures. A good data set for a first test of a new classifier, but not very challenging.
   (2) S. Aeberhard, D. Coomans and O. de Vel, "The Classification Performance of RDA", Tech. Rep. no. 92-01 (1992), Dept. of Computer Science and Dept. of Mathematics and Statistics, James Cook University of North Queensland. (Also submitted to Journal of Chemometrics.)
       Here, the data was used to illustrate the superior performance of a new appreciation function with RDA.

4. Relevant Information:
   These data are the results of a chemical analysis of wines grown in the same region in Italy but derived from three different cultivars. The analysis determined the quantities of 13 constituents found in each of the three types of wines.
   The initial data set had around 30 variables, but only this 13-dimensional version is available; the list of the original variables was lost, so it is not known which 13 variables are included in the set.
   The attributes are (donated by Riccardo Leardi, riclea@anchem.unige.it):
   1) Alcohol
   2) Malic acid
   3) Ash
   4) Alcalinity of ash
   5) Magnesium
   6) Total phenols
   7) Flavanoids
   8) Nonflavanoid phenols
   9) Proanthocyanins
   10) Color intensity
   11) Hue
   12) OD280/OD315 of diluted wines
   13) Proline

5. Number of Instances:
   class 1: 59
   class 2: 71
   class 3: 48

6. Number of Attributes: 13

7. For Each Attribute:
   All attributes are continuous. No statistics are available, but it is suggested to standardise the variables for certain uses (e.g. with classifiers which are NOT scale invariant).
   NOTE: the 1st column is the class identifier (1-3).

8. Missing Attribute Values: None

9. Class Distribution (number of instances per class):
   class 1: 59
   class 2: 71
   class 3: 48

The CSV data can be downloaded here:
https://download.csdn.net/download/ayangann915/12278359
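
If the CSV download is unavailable, the same data also ships with scikit-learn as sklearn.datasets.load_wine; a minimal sketch for checking the class and feature counts listed above (note that load_wine labels the classes 0-2 instead of 1-3):

from sklearn.datasets import load_wine
import numpy as np

data = load_wine()
print(data.data.shape)           # (178, 13): 178 samples, 13 features
print(data.feature_names)        # alcohol, malic_acid, ash, ...
print(np.bincount(data.target))  # [59 71 48]: instances per class (labelled 0-2 here)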

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import classification_report, confusion_matrix

# Read the data
wine = np.genfromtxt("wine_data.csv", delimiter=",")
# Take all rows and every column after the first: the first column is the label, the rest are features
X = wine[:, 1:]
# The first column is the label
y = wine[:, 0]
# Split into training and test sets; 30% of the samples are held out for testing
x_train, x_test, y_train, y_test = train_test_split(X, y, test_size=0.3)
# Standardise the data: fit the scaler on the training set, then apply the same transform to the test set
scaler = StandardScaler()
x_train = scaler.fit_transform(x_train)
x_test = scaler.transform(x_test)
# Build the network: three hidden layers with 100, 50 and 20 neurons, at most 500 training iterations
mlp = MLPClassifier(hidden_layer_sizes=(100, 50, 20), max_iter=500)
# Train
mlp.fit(x_train, y_train)
# Predict
predict = mlp.predict(x_test)
# Print precision/recall/f1 of the predictions against the true labels
print(classification_report(y_test, predict))
"""
              precision    recall  f1-score   support

         1.0       0.94      1.00      0.97        16
         2.0       1.00      0.95      0.97        19
         3.0       1.00      1.00      1.00        19

    accuracy                           0.98        54
   macro avg       0.98      0.98      0.98        54
weighted avg       0.98      0.98      0.98        54
"""
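
confusion_matrix is imported above but never used; a minimal sketch of summarising the same test split with it and with accuracy_score (the extra import and the two print calls below are additions, not part of the original script):

from sklearn.metrics import accuracy_score

# Rows correspond to the true classes 1.0-3.0, columns to the predicted classes
print(confusion_matrix(y_test, predict))
# Overall accuracy as a single number
print(accuracy_score(y_test, predict))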

Building a BP neural network by hand (lower accuracy)

from sklearn.model_selection import train_test_split
from sklearn.preprocessing import LabelBinarizer, StandardScaler
from sklearn.metrics import confusion_matrix, classification_report
import numpy as np

# Read the data: the first column is the label, the remaining 13 columns are features
wine = np.genfromtxt("wine_data.csv", delimiter=",")
X = wine[:, 1:]
y = wine[:, 0]
x_train, x_test, y_train, y_test = train_test_split(X, y)
# Standardisation is left commented out; the network below trains on the raw features
# scale = StandardScaler()
# x_train = scale.fit_transform(x_train)
# x_test = scale.transform(x_test)
# Learning rate
rate = 0.11
# Weights: 13 inputs, a hidden layer of 100 neurons, 3 outputs; initialised uniformly in [-1, 1)
v = np.random.random((13, 100)) * 2 - 1
w = np.random.random((100, 3)) * 2 - 1
# One-hot encode the training labels (classes 1-3 map to columns 0-2)
label_train = LabelBinarizer().fit_transform(y_train)

# Activation function and its derivative (written in terms of the sigmoid output)
def sigmoid(x):
    return 1 / (1 + np.exp(-x))

def dsigmoid(x):
    return x * (1 - x)

# Train with stochastic gradient descent: one randomly chosen sample per step
def train(X, y, step=10000, rate=0.11):
    global v, w
    for n in range(step + 1):
        i = np.random.randint(X.shape[0])
        x = np.atleast_2d(X[i])
        # Forward pass
        L1 = sigmoid(np.dot(x, v))
        L2 = sigmoid(np.dot(L1, w))
        # Backpropagate the error
        L2_delta = (y[i] - L2) * dsigmoid(L2)
        L1_delta = L2_delta.dot(w.T) * dsigmoid(L1)
        # Update the weights
        w = w + rate * L1.T.dot(L2_delta)
        v = v + rate * x.T.dot(L1_delta)
        # Report test-set accuracy every 1000 steps
        if n % 1000 == 0:
            output = predict(x_test)
            # argmax gives column indices 0-2, while the class labels are 1-3
            predictions = np.argmax(output, axis=1) + 1
            acc = np.mean(np.equal(predictions, y_test))
            print("step:", n, "acc:", acc)

def predict(x):
    L1 = sigmoid(np.dot(x, v))
    L2 = sigmoid(np.dot(L1, w))
    return L2

train(x_train, label_train, 30000)
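
classification_report and confusion_matrix are imported in this block but not used; a minimal sketch, reusing the predict function and the variables above, of how the hand-written network could be evaluated after training (the +1 maps the argmax column index back to the 1-3 class labels):

# Forward-pass the whole test set and compare against the true labels
output = predict(x_test)
predictions = np.argmax(output, axis=1) + 1
print(confusion_matrix(y_test, predictions))
print(classification_report(y_test, predictions))

Enabling the commented-out standardisation is one likely way to narrow the accuracy gap to the MLPClassifier above, since sigmoid units saturate on unscaled features such as Proline.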
