【机器学习9】KNN

【机器学习9】KNN,第1张

文章目录
  • 1. KNN算法原理
  • 2. KNN算法特点
  • 3. K值
  • 4. 数据处理
    • 4.1 KDTree
    • 4.2 数据归一化
  • 5. 算法流程
  • 6. Java代码
    • 6.1 训练集,测试集
    • 6.2 KNN.java
    • 6.3 预测

1. KNN算法原理
- "英文:"
		K-Nearest Neighbor

- "思想:"
		对于任意n维输入向量,分别对应于特征空间中的一个点,输出为该特征向量所对应的类别标签或预测值

- "原理:"
		1. 它的工作原理是利用训练数据对特征向量空间进行划分,并将划分结果作为最终算法模型
		2. 输入没有标签的数据后,将这个没有标签的数据的每个特征与样本集中的数据对应的特征进行比较,
		然后提取样本中特征最相近的数据(最近邻)的分类标签
2. KNN算法特点
		1. 最基础、最简单
		2. 我们只选择样本数据集中前k个最相似的数据,这就是KNN算法中K的由来(k<=203. 优点:精度高、对异常值不敏感
		4. 缺点:计算复杂度高、空间复杂度高
		5. 也被称为惰性学习 lazy learning. 类似于开卷考试, 在已有数据中去找答案
		6. 一般需要对数据归一化
3. K值
- "K值选择:"
		1. k值过小,误差较小,但容易导致KNN算法的过拟合
		2. k值过大,鲁棒性,模型容易发生欠拟合
		3. 一般k值选得比较小,我们会在较小范围内选取k值,同时把测试集上准确率最高的那个确定为最终的算法超参数k

- "交叉验证:"
		1. 将样本数据按照一定比例,拆分出训练用的数据和验证用的数据(比如64拆分出部分训练数据和验证数据)
		2. 从选取一个较小的K值开始不断增加K的值
		3. 然后计算验证集合的方差,最终找到一个比较合适的K值。
4. 数据处理 4.1 KDTree
- "目的:"
		为了快速查找到k个近邻,我们可以考虑使用特殊的数据结构存储训练数据,用来减少搜索次数。
		其中,KDTree就是最著名的一种。
4.2 数据归一化
  • 原因
1. 在距离计算中我们需要计算预测样本与训练数据集中多个特征的距离,但不同特征之间数据差异较大,会造成预测错误。
2. 可以类比:欧氏距离与标准化欧式距离
3. 目的:使特征等价
4. 数据归一化,使不同特征属于同一数量级,之间的差异性减弱,有利于KNN的分类
  • 数据归一化

一般的数据归一化是将数据映射在 [0,1] 中,公式如下:

X = X − X m i n X m a x − X m i n X=\frac{X-X_{min}}{X_{max}-X_{min}} X=XmaxXminXXmin

5. 算法流程
		1. 计算待预测样本与训练数据集中样本特征之间的欧式距离。 
		2. 按照距离递增的顺序排序。
		3. 选取距离最近的K个样本以及所属类别的次数。
		4. 返回前k个点所出现频率最高的类别作为预测分类结果。
6. Java代码 6.1 训练集,测试集
  • iris.arff
@RELATION iris

@ATTRIBUTE sepallength	REAL
@ATTRIBUTE sepalwidth 	REAL
@ATTRIBUTE petallength 	REAL
@ATTRIBUTE petalwidth	REAL
@ATTRIBUTE class 	{Iris-setosa,Iris-versicolor,Iris-virginica}

@DATA
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3.0,1.4,0.2,Iris-setosa
4.7,3.2,1.3,0.2,Iris-setosa
4.6,3.1,1.5,0.2,Iris-setosa
5.0,3.6,1.4,0.2,Iris-setosa
5.4,3.9,1.7,0.4,Iris-setosa
4.6,3.4,1.4,0.3,Iris-setosa
5.0,3.4,1.5,0.2,Iris-setosa
4.4,2.9,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.4,3.7,1.5,0.2,Iris-setosa
4.8,3.4,1.6,0.2,Iris-setosa
4.8,3.0,1.4,0.1,Iris-setosa
4.3,3.0,1.1,0.1,Iris-setosa
5.8,4.0,1.2,0.2,Iris-setosa
5.7,4.4,1.5,0.4,Iris-setosa
5.4,3.9,1.3,0.4,Iris-setosa
5.1,3.5,1.4,0.3,Iris-setosa
5.7,3.8,1.7,0.3,Iris-setosa
5.1,3.8,1.5,0.3,Iris-setosa
5.4,3.4,1.7,0.2,Iris-setosa
5.1,3.7,1.5,0.4,Iris-setosa
4.6,3.6,1.0,0.2,Iris-setosa
5.1,3.3,1.7,0.5,Iris-setosa
4.8,3.4,1.9,0.2,Iris-setosa
5.0,3.0,1.6,0.2,Iris-setosa
5.0,3.4,1.6,0.4,Iris-setosa
5.2,3.5,1.5,0.2,Iris-setosa
5.2,3.4,1.4,0.2,Iris-setosa
4.7,3.2,1.6,0.2,Iris-setosa
4.8,3.1,1.6,0.2,Iris-setosa
5.4,3.4,1.5,0.4,Iris-setosa
5.2,4.1,1.5,0.1,Iris-setosa
5.5,4.2,1.4,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
5.0,3.2,1.2,0.2,Iris-setosa
5.5,3.5,1.3,0.2,Iris-setosa
4.9,3.1,1.5,0.1,Iris-setosa
4.4,3.0,1.3,0.2,Iris-setosa
5.1,3.4,1.5,0.2,Iris-setosa
5.0,3.5,1.3,0.3,Iris-setosa
4.5,2.3,1.3,0.3,Iris-setosa
4.4,3.2,1.3,0.2,Iris-setosa
5.0,3.5,1.6,0.6,Iris-setosa
5.1,3.8,1.9,0.4,Iris-setosa
4.8,3.0,1.4,0.3,Iris-setosa
5.1,3.8,1.6,0.2,Iris-setosa
4.6,3.2,1.4,0.2,Iris-setosa
5.3,3.7,1.5,0.2,Iris-setosa
5.0,3.3,1.4,0.2,Iris-setosa
7.0,3.2,4.7,1.4,Iris-versicolor
6.4,3.2,4.5,1.5,Iris-versicolor
6.9,3.1,4.9,1.5,Iris-versicolor
5.5,2.3,4.0,1.3,Iris-versicolor
6.5,2.8,4.6,1.5,Iris-versicolor
5.7,2.8,4.5,1.3,Iris-versicolor
6.3,3.3,4.7,1.6,Iris-versicolor
4.9,2.4,3.3,1.0,Iris-versicolor
6.6,2.9,4.6,1.3,Iris-versicolor
5.2,2.7,3.9,1.4,Iris-versicolor
5.0,2.0,3.5,1.0,Iris-versicolor
5.9,3.0,4.2,1.5,Iris-versicolor
6.0,2.2,4.0,1.0,Iris-versicolor
6.1,2.9,4.7,1.4,Iris-versicolor
5.6,2.9,3.6,1.3,Iris-versicolor
6.7,3.1,4.4,1.4,Iris-versicolor
5.6,3.0,4.5,1.5,Iris-versicolor
5.8,2.7,4.1,1.0,Iris-versicolor
6.2,2.2,4.5,1.5,Iris-versicolor
5.6,2.5,3.9,1.1,Iris-versicolor
5.9,3.2,4.8,1.8,Iris-versicolor
6.1,2.8,4.0,1.3,Iris-versicolor
6.3,2.5,4.9,1.5,Iris-versicolor
6.1,2.8,4.7,1.2,Iris-versicolor
6.4,2.9,4.3,1.3,Iris-versicolor
6.6,3.0,4.4,1.4,Iris-versicolor
6.8,2.8,4.8,1.4,Iris-versicolor
6.7,3.0,5.0,1.7,Iris-versicolor
6.0,2.9,4.5,1.5,Iris-versicolor
5.7,2.6,3.5,1.0,Iris-versicolor
5.5,2.4,3.8,1.1,Iris-versicolor
5.5,2.4,3.7,1.0,Iris-versicolor
5.8,2.7,3.9,1.2,Iris-versicolor
6.0,2.7,5.1,1.6,Iris-versicolor
5.4,3.0,4.5,1.5,Iris-versicolor
6.0,3.4,4.5,1.6,Iris-versicolor
6.7,3.1,4.7,1.5,Iris-versicolor
6.3,2.3,4.4,1.3,Iris-versicolor
5.6,3.0,4.1,1.3,Iris-versicolor
5.5,2.5,4.0,1.3,Iris-versicolor
5.5,2.6,4.4,1.2,Iris-versicolor
6.1,3.0,4.6,1.4,Iris-versicolor
5.8,2.6,4.0,1.2,Iris-versicolor
5.0,2.3,3.3,1.0,Iris-versicolor
5.6,2.7,4.2,1.3,Iris-versicolor
5.7,3.0,4.2,1.2,Iris-versicolor
5.7,2.9,4.2,1.3,Iris-versicolor
6.2,2.9,4.3,1.3,Iris-versicolor
5.1,2.5,3.0,1.1,Iris-versicolor
5.7,2.8,4.1,1.3,Iris-versicolor
6.3,3.3,6.0,2.5,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
7.1,3.0,5.9,2.1,Iris-virginica
6.3,2.9,5.6,1.8,Iris-virginica
6.5,3.0,5.8,2.2,Iris-virginica
7.6,3.0,6.6,2.1,Iris-virginica
4.9,2.5,4.5,1.7,Iris-virginica
7.3,2.9,6.3,1.8,Iris-virginica
6.7,2.5,5.8,1.8,Iris-virginica
7.2,3.6,6.1,2.5,Iris-virginica
6.5,3.2,5.1,2.0,Iris-virginica
6.4,2.7,5.3,1.9,Iris-virginica
6.8,3.0,5.5,2.1,Iris-virginica
5.7,2.5,5.0,2.0,Iris-virginica
5.8,2.8,5.1,2.4,Iris-virginica
6.4,3.2,5.3,2.3,Iris-virginica
6.5,3.0,5.5,1.8,Iris-virginica
7.7,3.8,6.7,2.2,Iris-virginica
7.7,2.6,6.9,2.3,Iris-virginica
6.0,2.2,5.0,1.5,Iris-virginica
6.9,3.2,5.7,2.3,Iris-virginica
5.6,2.8,4.9,2.0,Iris-virginica
7.7,2.8,6.7,2.0,Iris-virginica
6.3,2.7,4.9,1.8,Iris-virginica
6.7,3.3,5.7,2.1,Iris-virginica
7.2,3.2,6.0,1.8,Iris-virginica
6.2,2.8,4.8,1.8,Iris-virginica
6.1,3.0,4.9,1.8,Iris-virginica
6.4,2.8,5.6,2.1,Iris-virginica
7.2,3.0,5.8,1.6,Iris-virginica
7.4,2.8,6.1,1.9,Iris-virginica
7.9,3.8,6.4,2.0,Iris-virginica
6.4,2.8,5.6,2.2,Iris-virginica
6.3,2.8,5.1,1.5,Iris-virginica
6.1,2.6,5.6,1.4,Iris-virginica
7.7,3.0,6.1,2.3,Iris-virginica
6.3,3.4,5.6,2.4,Iris-virginica
6.4,3.1,5.5,1.8,Iris-virginica
6.0,3.0,4.8,1.8,Iris-virginica
6.9,3.1,5.4,2.1,Iris-virginica
6.7,3.1,5.6,2.4,Iris-virginica
6.9,3.1,5.1,2.3,Iris-virginica
5.8,2.7,5.1,1.9,Iris-virginica
6.8,3.2,5.9,2.3,Iris-virginica
6.7,3.3,5.7,2.5,Iris-virginica
6.7,3.0,5.2,2.3,Iris-virginica
6.3,2.5,5.0,1.9,Iris-virginica
6.5,3.0,5.2,2.0,Iris-virginica
6.2,3.4,5.4,2.3,Iris-virginica
5.9,3.0,5.1,1.8,Iris-virginica
  • test.arff
@RELATION iris

@ATTRIBUTE sepallength	REAL
@ATTRIBUTE sepalwidth 	REAL
@ATTRIBUTE petallength 	REAL
@ATTRIBUTE petalwidth	REAL
@ATTRIBUTE class 	{Iris-setosa,Iris-versicolor,Iris-virginica}

@DATA
5.1,3.5,1.4,0.2,Iris-setosa
4.9,3.0,1.4,0.2,Iris-setosa
6.2 KNN.java
package supervisedlearning;

import java.io.FileReader;
import java.util.Arrays;
import java.util.HashSet;
import java.util.Iterator;
import java.util.Set;

import weka.core.Instances;

public class KNN {
	Instances trainDataset;
	int k;

	/**
	 * 读取训练集(参照集)
	 * @param trainingFile
	 * @param numNeighbors
	 */
	public KNN(String trainingFile, int numNeighbors) {
		trainDataset = null;
		k = numNeighbors;
		try {
			FileReader fileReader = new FileReader(trainingFile);
			trainDataset = new Instances(fileReader);
			fileReader.close();
		} catch (Exception e) {
			System.out.println("Cannot read the file: " + trainingFile + "\r\n" + e);
			System.exit(0);
		}
	}

	/**
	 * 从样本里面随机选num个样本下标
	 * 
	 * @param length
	 * @param num
	 * @return
	 */
	public int[] getRandomIndices(int length, int num) {
		// 1
		Set<Integer> set = new HashSet<>(num);
		// 2
		while (set.size() != num) {
			for (int i = 0; i < num; i++) {
				set.add((int) (Math.random() * length));
			}
		}
		// 3
		int[] result = new int[num];
		Iterator<Integer> it = set.iterator();
		for (int i = 0; i < num; i++) {
			result[i] = it.next();
		}
		return result;
	}

	/**
	 * 通过下标,来计算两个样本(原型样本---其余样本)的欧式距离
	 * @param testIndex
	 * @param trainIndex
	 * @return
	 */
	public double eudistance(int testIndex, int trainIndex) {
		double result = 0.0;
		for (int i = 0; i < trainDataset.numAttributes() - 1; i++) {
			double temp;
			double p1 = trainDataset.instance(testIndex).value(i);
			double p2 = trainDataset.instance(trainIndex).value(i);
			temp = Math.abs((p1 - p2)) * Math.abs((p1 - p2));
			result += temp;
		}
		return Math.sqrt(result);
	}

	/**
	 * 读取测试集
	 * @param testingFile
	 * @return
	 */
	public Instances readTestingFile(String testingFile) {
		Instances result = null;
		try {
			FileReader fileReader = new FileReader(testingFile);
			result = new Instances(fileReader);
			fileReader.close();
		} catch (Exception e) {
			System.out.println("Cannot read the file: " + testingFile + "\r\n" + e);
			System.exit(0);
		}
		return result;
	}
	
	/**
	 * 测试样本(下标)到训练样本(下标)的欧式距离---->构成了二维矩阵
	 * @param testDataset
	 * @param trainDataset
	 * @return
	 */
	public double[][] getDistances(Instances testDataset, Instances trainDataset) {
		int trainM = trainDataset.numInstances();
		int testM = testDataset.numInstances();
		double[][] distances = new double[testM][trainM];
		for (int i = 0; i < testM; i++) {
			for (int j = 0; j < trainM; j++) {
				distances[i][j] = eudistance(i, j);
			}
		}
		return distances;
	}
	
	public void bubbleSort(double[] arrs) {
		int n = arrs.length;
		for(int i=0; i<n; i++) {
			for(int j=0; j<n-1-i; j++) {
				if(arrs[j] > arrs[j+1]) {
					swap(arrs, j, j+1);
				}
			}
		}
	} // bubbleSort
	
	private void swap(double[] arrs, int i, int j) {
		double temp = arrs[i];
		arrs[i] = arrs[j];
		arrs[j] = temp;
	} // swap
	
	public int[][] getIndexByDistances(double[][] distances) {
		int len1 = distances.length;
		int len2 = distances[0].length;
		int[][] result = new int[len1][len2];
		// 1
		return result;
	}

	
	public void predict(String testingFile) {
		Instances testDataset = readTestingFile(testingFile);
		int trainM = trainDataset.numInstances();
		int testM = testDataset.numInstances();
		// 1
		double[][] distances = new double[testM][trainM];
		distances = getDistances(testDataset, trainDataset);
		System.out.println("测试样本到训练样本的距离:" + Arrays.deepToString(distances));
		int[][] indexOfTrains = getIndexByDistances(distances);
		System.out.println("训练样本:" + Arrays.deepToString(indexOfTrains));
		// 2
		for(int i=0; i< testM; i++) {
			bubbleSort(distances[i]);
		}
		System.out.println("排序后的距离:" + Arrays.deepToString(distances));
		indexOfTrains = getIndexByDistances(distances);
		System.out.println("训练样本:" + Arrays.deepToString(indexOfTrains));
		// 3
		double[][] kDistances = new double[testM][k];
		for (int i = 0; i < testM; i++) {
			for (int j = 0; j < k; j++) {
				kDistances[i][j] = distances[i][j];
			}
		}
		System.out.println(k + "个临近的距离:" + Arrays.deepToString(kDistances));
		int[][] minIndex = getIndexByDistances(kDistances);
		System.out.println("训练样本:" + Arrays.deepToString(minIndex));
		// 4
	}

	public double getAccuracy() {
		double result = 0.0;
		// 1
		return result;
	}

	public static void main(String args[]) {
		KNN knn = new KNN("D:/data/iris.arff", 7);
		knn.predict("D:/data/test.arff");
		System.out.println("准确度:" + knn.getAccuracy()*100 + "%");
	}
	
} // KNN
6.3 预测
测试样本到训练样本的距离:[[0.0, 0.5385164807134502, 0.509901951359278, 0.648074069840786, 0.1414213562373093, 0.6164414002968979, 0.5196152422706632, 0.17320508075688762, 0.9219544457292882, 0.4690415759823426, 0.37416573867739483, 0.3741657386773941, 0.5916079783099616, 0.9949874371066197, 0.8831760866327848, 1.1045361017187267, 0.5477225575051664, 0.09999999999999998, 0.7416198487095667, 0.33166247903553986, 0.4358898943540679, 0.30000000000000016, 0.648074069840786, 0.46904157598234303, 0.5916079783099616, 0.5477225575051662, 0.316227766016838, 0.14142135623730995, 0.14142135623730995, 0.53851648071345, 0.5385164807134504, 0.3872983346207423, 0.6244997998398396, 0.8062257748298554, 0.4690415759823426, 0.37416573867739383, 0.41231056256176635, 0.4690415759823426, 0.866025403784438, 0.14142135623730964, 0.17320508075688743, 1.3490737563232043, 0.7681145747868601, 0.45825756949558394, 0.6164414002968975, 0.5916079783099616, 0.3605551275463989, 0.58309518948453, 0.30000000000000027, 0.22360679774997896, 4.003748243833521, 3.6166282640050254, 4.164132562731403, 3.093541659651604, 3.792097045171708, 3.416138170507745, 3.7854986461495406, 2.345207879911715, 3.749666651850535, 2.8879058156387303, 2.703701166919155, 3.228002478313795, 3.146426544510455, 3.7, 2.5806975801127883, 3.627671429443412, 3.4351128074635335, 3.009983388658482, 3.7682887362833544, 2.882707061079915, 3.8535697735995385, 3.0757112998459397, 4.047221268969613, 3.6578682316343767, 3.416138170507745, 3.59722114972099, 4.047221268969612, 4.244997055358225, 3.531288716601915, 2.4939927826679855, 2.8178005607210745, 2.7018512172212596, 2.8948229652260253, 4.135214625627066, 3.411744421846396, 3.5199431813596087, 3.9115214431215897, 3.6180105030251095, 2.9999999999999996, 3.0215889859476257, 3.3120990323358397, 3.59583091927304, 3.0099833886584824, 2.387467277262665, 3.1527765540868895, 3.07408522978788, 3.1256999216175574, 3.3451457367355464, 2.0904544960366875, 3.057776970284131, 5.2848841046895245, 4.208325082500163, 5.301886456724625, 4.690415759823429, 5.056678751908213, 6.0950799830683104, 3.591656999213594, 5.636488268416782, 5.047771785649585, 5.639148871948673, 4.356604182158392, 4.519955751995809, 4.853864439804639, 4.190465367951393, 4.417012565071555, 4.626013402488151, 4.645427859734774, 6.240192304729079, 6.498461356351979, 4.141255848169732, 5.121523210920752, 4.028647415696738, 6.211280061307815, 4.109744517606904, 4.969909455915672, 5.31224999411737, 3.9774363602702683, 4.007492981902776, 4.840454524112379, 5.0970579749498635, 5.546169849544818, 6.014149981501959, 4.880573736764972, 4.160528812542944, 4.570557952810575, 5.788782255362521, 4.891829923454003, 4.606517122512408, 3.8961519477556315, 4.796873982084583, 5.0199601592044525, 4.636809247747852, 4.208325082500163, 5.2573757712379665, 5.136146415358503, 4.654030511288039, 4.27668095606862, 4.459820624195552, 4.650806381693394, 4.1400483088968905], [0.5385164807134502, 0.0, 0.30000000000000016, 0.3316624790355407, 0.608276253029822, 1.0908712114635715, 0.5099019513592788, 0.42426406871192834, 0.5099019513592785, 0.17320508075688784, 0.8660254037844388, 0.4582575694955841, 0.1414213562373099, 0.6782329983125273, 1.360147050873544, 1.6278820596099708, 1.0535653752852738, 0.5477225575051659, 1.1747340124470729, 0.8366600265340752, 0.7071067811865475, 0.7615773105863909, 0.7810249675906658, 0.5567764362830019, 0.6480740698407861, 0.22360679774997896, 0.4999999999999999, 0.5916079783099616, 0.49999999999999983, 0.3464101615137758, 0.24494897427831822, 0.6782329983125268, 1.1489125293076055, 1.3416407864998738, 0.17320508075688784, 0.3, 0.7874007874011809, 0.17320508075688784, 0.5099019513592784, 0.4582575694955836, 0.529150262212918, 0.8185352771872454, 0.5477225575051662, 0.6782329983125268, 0.9848857801796101, 0.14142135623730986, 0.8485281374238567, 0.3605551275463996, 0.812403840463596, 0.31622776601683766, 4.096339829652808, 3.6864617182333523, 4.236744032862973, 2.9698484809834995, 3.811823710509183, 3.3911649915626345, 3.8600518131237567, 2.1470910553583886, 3.788139384975162, 2.8053520278211073, 2.4617067250182343, 3.2449961479175906, 3.0413812651491097, 3.712142238654117, 2.5592967784139455, 3.7000000000000006, 3.433656942677879, 2.971531591620725, 3.6918829883949464, 2.792848008753788, 3.893584466786357, 3.0740852297878796, 4.018706259482024, 3.6565010597564442, 3.4467375879228173, 3.651027252705737, 4.080441152620633, 4.295346318982906, 3.5383612025908264, 2.4186773244895647, 2.7, 2.5787593916455256, 2.8548204847240393, 4.11703777004778, 3.398529093593286, 3.59722114972099, 3.9786932528155523, 3.55808937493144, 2.9983328701129897, 2.929163703175362, 3.2434549480453714, 3.622154055254966, 2.9546573405388314, 2.179449471770337, 3.10322412983658, 3.0789608636681307, 3.1144823004794877, 3.3645207682521443, 1.9131126469708992, 3.029851481508623, 5.338539126015655, 4.180908992073374, 5.357238094391549, 4.708502946797421, 5.091168824543142, 6.159545437773796, 3.479942528261063, 5.686826883245173, 5.040833264451424, 5.747173218200404, 4.41927595879687, 4.521061822182926, 4.902040391510457, 4.134005321718878, 4.402272140611027, 4.68081189538738, 4.682947789587238, 6.369458375717672, 6.5314623171231725, 4.06201920231798, 5.1903757089443925, 4.0024992192379, 6.2617888817813085, 4.106093033529563, 5.042816673249187, 5.389805191284746, 3.9812058474788765, 4.031128874149275, 4.851803788283281, 5.158488150611572, 5.591958512006325, 6.154673021371647, 4.891829923454003, 4.168932717135165, 4.547526800360829, 5.860034129593443, 4.959838707054897, 4.650806381693394, 3.9153543900903784, 4.860041152089146, 5.072474741188959, 4.702127178203498, 4.180908992073374, 5.320714237769211, 5.206726418777926, 4.699999999999999, 4.249705872175156, 4.498888751680798, 4.718050444834179, 4.153311931459037]]
训练样本:[[0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149], [0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21, 22, 23, 24, 25, 26, 27, 28, 29, 30, 31, 32, 33, 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46, 47, 48, 49, 50, 51, 52, 53, 54, 55, 56, 57, 58, 59, 60, 61, 62, 63, 64, 65, 66, 67, 68, 69, 70, 71, 72, 73, 74, 75, 76, 77, 78, 79, 80, 81, 82, 83, 84, 85, 86, 87, 88, 89, 90, 91, 92, 93, 94, 95, 96, 97, 98, 99, 100, 101, 102, 103, 104, 105, 106, 107, 108, 109, 110, 111, 112, 113, 114, 115, 116, 117, 118, 119, 120, 121, 122, 123, 124, 125, 126, 127, 128, 129, 130, 131, 132, 133, 134, 135, 136, 137, 138, 139, 140, 141, 142, 143, 144, 145, 146, 147, 148, 149]]
排序后的距离:[[0.0, 0.09999999999999998, 0.1414213562373093, 0.14142135623730964, 0.14142135623730995, 0.14142135623730995, 0.17320508075688743, 0.17320508075688762, 0.22360679774997896, 0.30000000000000016, 0.30000000000000027, 0.316227766016838, 0.33166247903553986, 0.3605551275463989, 0.37416573867739383, 0.3741657386773941, 0.37416573867739483, 0.3872983346207423, 0.41231056256176635, 0.4358898943540679, 0.45825756949558394, 0.4690415759823426, 0.4690415759823426, 0.4690415759823426, 0.46904157598234303, 0.509901951359278, 0.5196152422706632, 0.53851648071345, 0.5385164807134502, 0.5385164807134504, 0.5477225575051662, 0.5477225575051664, 0.58309518948453, 0.5916079783099616, 0.5916079783099616, 0.5916079783099616, 0.6164414002968975, 0.6164414002968979, 0.6244997998398396, 0.648074069840786, 0.648074069840786, 0.7416198487095667, 0.7681145747868601, 0.8062257748298554, 0.866025403784438, 0.8831760866327848, 0.9219544457292882, 0.9949874371066197, 1.1045361017187267, 1.3490737563232043, 2.0904544960366875, 2.345207879911715, 2.387467277262665, 2.4939927826679855, 2.5806975801127883, 2.7018512172212596, 2.703701166919155, 2.8178005607210745, 2.882707061079915, 2.8879058156387303, 2.8948229652260253, 2.9999999999999996, 3.009983388658482, 3.0099833886584824, 3.0215889859476257, 3.057776970284131, 3.07408522978788, 3.0757112998459397, 3.093541659651604, 3.1256999216175574, 3.146426544510455, 3.1527765540868895, 3.228002478313795, 3.3120990323358397, 3.3451457367355464, 3.411744421846396, 3.416138170507745, 3.416138170507745, 3.4351128074635335, 3.5199431813596087, 3.531288716601915, 3.591656999213594, 3.59583091927304, 3.59722114972099, 3.6166282640050254, 3.6180105030251095, 3.627671429443412, 3.6578682316343767, 3.7, 3.749666651850535, 3.7682887362833544, 3.7854986461495406, 3.792097045171708, 3.8535697735995385, 3.8961519477556315, 3.9115214431215897, 3.9774363602702683, 4.003748243833521, 4.007492981902776, 4.028647415696738, 4.047221268969612, 4.047221268969613, 4.109744517606904, 4.135214625627066, 4.1400483088968905, 4.141255848169732, 4.160528812542944, 4.164132562731403, 4.190465367951393, 4.208325082500163, 4.208325082500163, 4.244997055358225, 4.27668095606862, 4.356604182158392, 4.417012565071555, 4.459820624195552, 4.519955751995809, 4.570557952810575, 4.606517122512408, 4.626013402488151, 4.636809247747852, 4.645427859734774, 4.650806381693394, 4.654030511288039, 4.690415759823429, 4.796873982084583, 4.840454524112379, 4.853864439804639, 4.880573736764972, 4.891829923454003, 4.969909455915672, 5.0199601592044525, 5.047771785649585, 5.056678751908213, 5.0970579749498635, 5.121523210920752, 5.136146415358503, 5.2573757712379665, 5.2848841046895245, 5.301886456724625, 5.31224999411737, 5.546169849544818, 5.636488268416782, 5.639148871948673, 5.788782255362521, 6.014149981501959, 6.0950799830683104, 6.211280061307815, 6.240192304729079, 6.498461356351979], [0.0, 0.14142135623730986, 0.1414213562373099, 0.17320508075688784, 0.17320508075688784, 0.17320508075688784, 0.22360679774997896, 0.24494897427831822, 0.3, 0.30000000000000016, 0.31622776601683766, 0.3316624790355407, 0.3464101615137758, 0.3605551275463996, 0.42426406871192834, 0.4582575694955836, 0.4582575694955841, 0.49999999999999983, 0.4999999999999999, 0.5099019513592784, 0.5099019513592785, 0.5099019513592788, 0.529150262212918, 0.5385164807134502, 0.5477225575051659, 0.5477225575051662, 0.5567764362830019, 0.5916079783099616, 0.608276253029822, 0.6480740698407861, 0.6782329983125268, 0.6782329983125268, 0.6782329983125273, 0.7071067811865475, 0.7615773105863909, 0.7810249675906658, 0.7874007874011809, 0.812403840463596, 0.8185352771872454, 0.8366600265340752, 0.8485281374238567, 0.8660254037844388, 0.9848857801796101, 1.0535653752852738, 1.0908712114635715, 1.1489125293076055, 1.1747340124470729, 1.3416407864998738, 1.360147050873544, 1.6278820596099708, 1.9131126469708992, 2.1470910553583886, 2.179449471770337, 2.4186773244895647, 2.4617067250182343, 2.5592967784139455, 2.5787593916455256, 2.7, 2.792848008753788, 2.8053520278211073, 2.8548204847240393, 2.929163703175362, 2.9546573405388314, 2.9698484809834995, 2.971531591620725, 2.9983328701129897, 3.029851481508623, 3.0413812651491097, 3.0740852297878796, 3.0789608636681307, 3.10322412983658, 3.1144823004794877, 3.2434549480453714, 3.2449961479175906, 3.3645207682521443, 3.3911649915626345, 3.398529093593286, 3.433656942677879, 3.4467375879228173, 3.479942528261063, 3.5383612025908264, 3.55808937493144, 3.59722114972099, 3.622154055254966, 3.651027252705737, 3.6565010597564442, 3.6864617182333523, 3.6918829883949464, 3.7000000000000006, 3.712142238654117, 3.788139384975162, 3.811823710509183, 3.8600518131237567, 3.893584466786357, 3.9153543900903784, 3.9786932528155523, 3.9812058474788765, 4.0024992192379, 4.018706259482024, 4.031128874149275, 4.06201920231798, 4.080441152620633, 4.096339829652808, 4.106093033529563, 4.11703777004778, 4.134005321718878, 4.153311931459037, 4.168932717135165, 4.180908992073374, 4.180908992073374, 4.236744032862973, 4.249705872175156, 4.295346318982906, 4.402272140611027, 4.41927595879687, 4.498888751680798, 4.521061822182926, 4.547526800360829, 4.650806381693394, 4.68081189538738, 4.682947789587238, 4.699999999999999, 4.702127178203498, 4.708502946797421, 4.718050444834179, 4.851803788283281, 4.860041152089146, 4.891829923454003, 4.902040391510457, 4.959838707054897, 5.040833264451424, 5.042816673249187, 5.072474741188959, 5.091168824543142, 5.158488150611572, 5.1903757089443925, 5.206726418777926, 5.320714237769211, 5.338539126015655, 5.357238094391549, 5.389805191284746, 5.591958512006325, 5.686826883245173, 5.747173218200404, 5.860034129593443, 6.154673021371647, 6.159545437773796, 6.2617888817813085, 6.369458375717672, 6.5314623171231725]]
训练样本:[[0, 0, 0, 0, 0, 0, 0, 7, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 42, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 59, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 114, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 27, 0, 0, 0, 31, 0, 0, 0, 0, 36, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 59, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0]]
7个临近的距离:[[0.0, 0.09999999999999998, 0.1414213562373093, 0.14142135623730964, 0.14142135623730995, 0.14142135623730995, 0.17320508075688743], [0.0, 0.14142135623730986, 0.1414213562373099, 0.17320508075688784, 0.17320508075688784, 0.17320508075688784, 0.22360679774997896]]
训练样本:[[0, 0, 0, 0, 0, 0, 0], [0, 0, 0, 0, 0, 0, 0]]
准确度:0.0%

欢迎分享,转载请注明来源:内存溢出

原文地址: https://outofmemory.cn/langs/876683.html

(0)
打赏 微信扫一扫 微信扫一扫 支付宝扫一扫 支付宝扫一扫
上一篇 2022-05-13
下一篇 2022-05-13

发表评论

登录后才能评论

评论列表(0条)

保存