
Java Evaluation Class Code Examples


This article collects typical usage examples of the Java class weka.classifiers.evaluation.Evaluation. If you are unsure what the Evaluation class does or how to use it, the curated examples below should help.



The Evaluation class belongs to the weka.classifiers.evaluation package. Twenty code examples of the class are shown below, ordered by popularity.
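
Before the individual snippets, here is a minimal, self-contained sketch of the workflow most of them follow: load an ARFF file, set the last attribute as the class, run 10-fold cross-validation, and read the per-class metrics. The file path and class name are placeholders chosen for illustration and are not taken from any of the projects quoted below.

import java.util.Random;

import weka.classifiers.evaluation.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class EvaluationQuickStart {

    public static void main(String[] args) throws Exception {
        // "data/features.arff" is a placeholder; any ARFF file with a nominal class attribute works
        Instances data = DataSource.read("data/features.arff");
        data.setClassIndex(data.numAttributes() - 1); // last attribute is the class

        // 10-fold cross-validation of a C4.5 (J48) tree;
        // crossValidateModel trains its own copies of the classifier on each training fold
        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(new J48(), data, 10, new Random(1));

        System.out.println(eval.toSummaryString());
        // per-class metrics, as collected by the examples below
        System.out.printf("class 0: P=%.3f R=%.3f F1=%.3f%n",
                eval.precision(0), eval.recall(0), eval.fMeasure(0));
        System.out.printf("accuracy: %.3f%n", 1 - eval.errorRate());
    }
}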

Example 1: getEvalResultbySMOTE

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>To get 10-fold cross validation in one single arff in <b>path</b></p>
	 * <p>Use C4.5 and <b>SMOTE</b> to classify the dataset.</p>
	 * @param path dataset path
	 * @throws Exception
	 */
	public static void getEvalResultbySMOTE(String path, int index) throws Exception{
		
		Instances ins = DataSource.read(path);
		int numAttr = ins.numAttributes();
		ins.setClassIndex(numAttr - 1);
		
		SMOTE smote = new SMOTE();
		smote.setInputFormat(ins);
		
		/** classifiers setting*/
		J48 j48 = new J48();
//		j48.setConfidenceFactor(0.4f);
		j48.buildClassifier(ins);

		FilteredClassifier fc = new FilteredClassifier();
		fc.setClassifier(j48);
		fc.setFilter(smote);
			
		Evaluation eval = new Evaluation(ins);	
		eval.crossValidateModel(fc, ins, 10, new Random(1));
		
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//		System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
		results[index][0] = eval.precision(0);
		results[index][1] = eval.recall(0);
		results[index][2] = eval.fMeasure(0);
		results[index][3] = eval.precision(1);
		results[index][4] = eval.recall(1);
		results[index][5] = eval.fMeasure(1);
		results[index][6] = 1-eval.errorRate();
				
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 40, Source: ImbalanceProcessingAve.java


Example 2: evalToBytes

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/**
 * Serializes and compresses an Evaluation object to an array of bytes
 * 
 * @param eval the Evaluation object to serialize
 * @return an array of bytes
 * @throws IOException if a problem occurs
 */
protected static byte[] evalToBytes(Evaluation eval) throws IOException {
  ObjectOutputStream p = null;
  byte[] bytes = null;

  try {
    ByteArrayOutputStream ostream = new ByteArrayOutputStream();
    OutputStream os = ostream;

    p = new ObjectOutputStream(new BufferedOutputStream(new GZIPOutputStream(
      os)));
    p.writeObject(eval);
    p.flush();
    p.close();
    bytes = ostream.toByteArray();

    p = null;
  } finally {
    if (p != null) {
      p.close();
    }
  }

  return bytes;
}
 
Developer: mydzigear, Project: repo.kmeanspp.silhouette_score, Lines: 32, Source: WekaFoldBasedClassifierEvaluationHadoopMapper.java


Example 3: deserialize

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/**
 * Helper function to deserialize a Evaluation object from an array of bytes
 * 
 * @param bytes the array containing the compressed serialized Evaluation
 *          object
 * @return the deserialized Evaluation object
 * @throws Exception if a problem occurs
 */
protected Evaluation deserialize(byte[] bytes) throws Exception {
  ByteArrayInputStream istream = new ByteArrayInputStream(bytes);
  ObjectInputStream p = null;
  Object toReturn = null;

  try {
    p = new ObjectInputStream(new BufferedInputStream(new GZIPInputStream(
      istream)));

    toReturn = p.readObject();
    if (!(toReturn instanceof Evaluation)) {
      throw new Exception("Object deserialized was not an Evaluation object!");
    }
  } finally {
    if (p != null) {
      p.close();
    }
  }

  return (Evaluation) toReturn;
}
 
Developer: mydzigear, Project: repo.kmeanspp.silhouette_score, Lines: 30, Source: WekaClassifierEvaluationHadoopReducer.java
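
Examples 2 and 3 are the two halves of a round trip: the Hadoop mapper compresses and serializes a per-fold Evaluation, and the reducer restores it. The sketch below reproduces that round trip as a standalone program, relying on the fact that Evaluation is serializable (which the mapper's ObjectOutputStream call above already depends on); the class name and file path are placeholders, not part of either project.

import java.io.BufferedInputStream;
import java.io.BufferedOutputStream;
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.util.Random;
import java.util.zip.GZIPInputStream;
import java.util.zip.GZIPOutputStream;

import weka.classifiers.evaluation.Evaluation;
import weka.classifiers.trees.J48;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class EvaluationRoundTrip {

    public static void main(String[] args) throws Exception {
        // placeholder path; any ARFF file with a nominal class attribute works
        Instances data = DataSource.read("data/features.arff");
        data.setClassIndex(data.numAttributes() - 1);

        Evaluation eval = new Evaluation(data);
        eval.crossValidateModel(new J48(), data, 10, new Random(1));

        // compress and serialize, mirroring evalToBytes() in Example 2
        ByteArrayOutputStream ostream = new ByteArrayOutputStream();
        try (ObjectOutputStream out = new ObjectOutputStream(
                new BufferedOutputStream(new GZIPOutputStream(ostream)))) {
            out.writeObject(eval);
        }
        byte[] bytes = ostream.toByteArray();

        // decompress and deserialize, mirroring deserialize() in Example 3
        Evaluation restored;
        try (ObjectInputStream in = new ObjectInputStream(new BufferedInputStream(
                new GZIPInputStream(new ByteArrayInputStream(bytes))))) {
            restored = (Evaluation) in.readObject();
        }

        System.out.println(restored.toSummaryString());
    }
}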


Example 4: aggregate

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/**
 * Aggregate a list of Evaluation objects.
 * 
 * @param evals the list of Evaluation objects to aggregate
 * @return a single eval Evaluation
 * @throws Exception if a problem occurs
 */
public Evaluation aggregate(List<Evaluation> evals) throws Exception {

  if (evals.size() == 0) {
    throw new Exception("Nothing to aggregate!");
  }

  AggregateableEvaluation aggEval = new AggregateableEvaluation(evals.get(0));

  for (Evaluation e : evals) {
    aggEval.aggregate(e);
  }

  aggEval.finalizeAggregation();

  return aggEval;
}
 
Developer: mydzigear, Project: repo.kmeanspp.silhouette_score, Lines: 24, Source: WekaClassifierEvaluationReduceTask.java


Example 5: main

import weka.classifiers.evaluation.Evaluation; // import the required package/class
public static void main(String[] args) throws Exception{
	
	String databasePath = "data/features.arff";
	
	// Load the data in arff format
	Instances data = new Instances(new BufferedReader(new FileReader(databasePath)));
	
	// Set the last attribute as the class
	data.setClassIndex(data.numAttributes() - 1);

	// Build a basic decision tree model
	String[] options = new String[]{};
	J48 model = new J48();
	model.setOptions(options);
	model.buildClassifier(data);
	
	// Output decision tree
	System.out.println("Decision tree model:\n"+model);
	
	// Output source code implementing the decision tree
	System.out.println("Source code:\n"+model.toSource("ActivityRecognitionEngine"));
	
	// Check accuracy of model using 10-fold cross-validation
	Evaluation eval = new Evaluation(data);
	eval.crossValidateModel(model, data, 10, new Random(1), new String[] {});
	System.out.println("Model performance:\n"+eval.toSummaryString());
	
	String[] activities = new String[]{"Walk", "Walk", "Walk", "Run", "Walk", "Run", "Run", "Sit", "Sit", "Sit"};
	DiscreteLowPass dlpFilter = new DiscreteLowPass(3);
	for(String str : activities){
		System.out.println(str +" -> "+ dlpFilter.filter(str));
	}
	
}
 
Developer: PacktPublishing, Project: Machine-Learning-End-to-Endguide-for-Java-developers, Lines: 35, Source: ActivityRecognition.java


Example 6: getEvalResultbyNo

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>To get 10-fold cross validation in one single arff in <b>path</b></p>
	 * <p>Only use C4.5 to classify the dataset.</p>
	 * @param path dataset path
	 * @throws Exception
	 */
	public static void getEvalResultbyNo(String path, int index) throws Exception{
		
		Instances ins = DataSource.read(path);
		int numAttr = ins.numAttributes();
		ins.setClassIndex(numAttr - 1);
		
		/** classifiers setting*/
		J48 j48 = new J48();
//		j48.setConfidenceFactor(0.4f);
		j48.buildClassifier(ins);
		
		Evaluation eval = new Evaluation(ins);	
		eval.crossValidateModel(j48, ins, 10, new Random(1));
		
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//		System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
		results[index][0] = eval.precision(0);
		results[index][1] = eval.recall(0);
		results[index][2] = eval.fMeasure(0);
		results[index][3] = eval.precision(1);
		results[index][4] = eval.recall(1);
		results[index][5] = eval.fMeasure(1);
		results[index][6] = 1-eval.errorRate();
			
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 33, Source: ImbalanceProcessingAve.java


Example 7: getEvalResultbyResampling

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>To get 10-fold cross validation in one single arff in <b>path</b></p>
	 * <p>Use C4.5 and <b>Resampling</b> to classify the dataset.</p>
	 * @param path dataset path
	 * @throws Exception
	 */
	public static void getEvalResultbyResampling(String path, int index) throws Exception{
		
		Instances ins = DataSource.read(path);
		int numAttr = ins.numAttributes();
		ins.setClassIndex(numAttr - 1);
		
		Resample resample = new Resample();
		resample.setInputFormat(ins);
		
		/** classifiers setting*/
		J48 j48 = new J48();
//		j48.setConfidenceFactor(0.4f);
		j48.buildClassifier(ins);

		FilteredClassifier fc = new FilteredClassifier();
		fc.setClassifier(j48);
		fc.setFilter(resample);
			
		Evaluation eval = new Evaluation(ins);	
		eval.crossValidateModel(fc, ins, 10, new Random(1));
		
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//		System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
		results[index][0] = eval.precision(0);
		results[index][1] = eval.recall(0);
		results[index][2] = eval.fMeasure(0);
		results[index][3] = eval.precision(1);
		results[index][4] = eval.recall(1);
		results[index][5] = eval.fMeasure(1);
		results[index][6] = 1-eval.errorRate();
			
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 40, Source: ImbalanceProcessingAve.java


Example 8: getEvalResultbyCost

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>To get 10-fold cross validation in one single arff in <b>path</b></p>
	 * <p>Use C4.5 and <b>Cost-sensitive learning</b> to classify the dataset.</p>
	 * @param path dataset path
	 * @throws Exception
	 */
	public static void getEvalResultbyCost(String path, int index) throws Exception{
		
		Instances ins = DataSource.read(path);
		int numAttr = ins.numAttributes();
		ins.setClassIndex(numAttr - 1);
		
		/**Classifier setting*/
		J48 j48 = new J48();
//		j48.setConfidenceFactor(0.4f);
		j48.buildClassifier(ins);
		
		CostSensitiveClassifier csc = new CostSensitiveClassifier();
		csc.setClassifier(j48);
		csc.setCostMatrix(new CostMatrix(new BufferedReader(new FileReader("files/costm"))));
		
		Evaluation eval = new Evaluation(ins);
		
		eval.crossValidateModel(csc, ins, 10, new Random(1));
		
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//		System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
		results[index][0] = eval.precision(0);
		results[index][1] = eval.recall(0);
		results[index][2] = eval.fMeasure(0);
		results[index][3] = eval.precision(1);
		results[index][4] = eval.recall(1);
		results[index][5] = eval.fMeasure(1);
		results[index][6] = 1-eval.errorRate();
			
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 38, Source: ImbalanceProcessingAve.java


Example 9: getEvalResultbyDefault

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>To get 10-fold cross validation in one single arff in <b>path</b></p>
	 * <p>Use C4.5 and <b>SMOTE</b> to classify the dataset.</p>
	 * @param path dataset path
	 * @throws Exception
	 */
	public static void getEvalResultbyDefault(String path, int index) throws Exception{
		
		Instances ins = DataSource.read(path);
		int numAttr = ins.numAttributes();
		ins.setClassIndex(numAttr - 1);
		
		SMOTE smote = new SMOTE();
		smote.setInputFormat(ins);
		
		/** classifiers setting*/
		J48 j48 = new J48();
//		j48.setConfidenceFactor(0.4f);
		j48.buildClassifier(ins);

		FilteredClassifier fc = new FilteredClassifier();
		fc.setClassifier(j48);
		fc.setFilter(smote);
			
		Evaluation eval = new Evaluation(ins);	
		eval.crossValidateModel(fc, ins, 10, new Random(1));
		
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//		System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
		results[index][0] = eval.precision(0);
		results[index][1] = eval.recall(0);
		results[index][2] = eval.fMeasure(0);
		results[index][3] = eval.precision(1);
		results[index][4] = eval.recall(1);
		results[index][5] = eval.fMeasure(1);
		results[index][6] = 1-eval.errorRate();
				
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 40, Source: FeatureSelectionAve.java


Example 10: testCOMT2

import weka.classifiers.evaluation.Evaluation; // import the required package/class
public static void testCOMT2() throws Exception{
	BestConf bestconf = new BestConf();
	Instances trainingSet = DataIOFile.loadDataFromArffFile("data/trainingBestConf0.arff");
	trainingSet.setClassIndex(trainingSet.numAttributes()-1);
	
	Instances samplePoints = LHSInitializer.getMultiDimContinuous(bestconf.getAttributes(), InitialSampleSetSize, false);
	samplePoints.insertAttributeAt(trainingSet.classAttribute(), samplePoints.numAttributes());
	samplePoints.setClassIndex(samplePoints.numAttributes()-1);
	
	COMT2 comt = new COMT2(samplePoints, COMT2Iteration);
	
	comt.buildClassifier(trainingSet);
	
	Evaluation eval = new Evaluation(trainingSet);
	eval.evaluateModel(comt, trainingSet);
	System.err.println(eval.toSummaryString());
	
	Instance best = comt.getInstanceWithPossibleMaxY(samplePoints.firstInstance());
	Instances bestInstances = new Instances(trainingSet,2);
	bestInstances.add(best);
	DataIOFile.saveDataToXrffFile("data/trainingBestConf_COMT2.arff", bestInstances);
	
	//now we output the training set with the class value updated as the predicted value
	Instances output = new Instances(trainingSet, trainingSet.numInstances());
	Enumeration<Instance> enu = trainingSet.enumerateInstances();
	while(enu.hasMoreElements()){
		Instance ins = enu.nextElement();
		double[] values = ins.toDoubleArray();
		values[values.length-1] = comt.classifyInstance(ins);
		output.add(ins.copy(values));
	}
	DataIOFile.saveDataToXrffFile("data/trainingBestConf0_predict.xrff", output);
}
 
Developer: zhuyuqing, Project: bestconf, Lines: 34, Source: BestConf.java


Example 11: cleanup

import weka.classifiers.evaluation.Evaluation; // import the required package/class
@Override
public void cleanup(Context context) throws IOException {
  try {
    // aggregate the stats over all folds in this chunk
    AggregateableEvaluation agg = null;
    for (int i = 0; i < m_totalFolds; i++) {
      if (!m_classifierIsUpdateable || m_forceBatch) {
        String modelToLoad = "" + (i + 1) + "_" + m_originalModelFileName;
        Classifier foldModel = WekaClassifierHadoopMapper
          .loadClassifier(modelToLoad);
        m_tasks[i].setClassifier(foldModel);
      }

      m_tasks[i].finalizeTask();
      Evaluation eval = m_tasks[i].getEvaluation();

      // save memory
      m_tasks[i] = null;

      if (agg == null) {
        agg = new AggregateableEvaluation(eval);
      }
      agg.aggregate(eval);
    }

    if (agg != null) {
      byte[] bytes = evalToBytes(agg);
      String constantKey = "evaluation";
      Text key = new Text();
      key.set(constantKey);

      BytesWritable value = new BytesWritable();
      value.set(bytes, 0, bytes.length);

      context.write(key, value);
    }
  } catch (Exception ex) {
    throw new IOException(ex);
  }
}
 
Developer: mydzigear, Project: repo.kmeanspp.silhouette_score, Lines: 41, Source: WekaFoldBasedClassifierEvaluationHadoopMapper.java


Example 12: getEvalResultbyChiSquare

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>To get 10-fold cross validation in one single arff in <b>path</b></p>
	 * <p>Use C4.5 and <b>SMOTE</b>, combined with <b>Chi-Square</b> to classify the dataset.</p>
	 * @param path dataset path
	 * @throws Exception
	 */
	public static void getEvalResultbyChiSquare(String path, int index) throws Exception{
		
		Instances ins = DataSource.read(path);
		int numAttr = ins.numAttributes();
		ins.setClassIndex(numAttr - 1);
		
		/**chi-squared filter to process the whole dataset first*/
		ChiSquaredAttributeEval evall = new ChiSquaredAttributeEval();	
		Ranker ranker = new Ranker();
		AttributeSelection selector = new AttributeSelection();
		
		selector.setEvaluator(evall);
		selector.setSearch(ranker);
		selector.setInputFormat(ins);
		ins = Filter.useFilter(ins, selector);
		
		SMOTE smote = new SMOTE();
		smote.setInputFormat(ins);
		
		/** classifiers setting*/
		J48 j48 = new J48();
//		j48.setConfidenceFactor(0.4f);
		j48.buildClassifier(ins);

		FilteredClassifier fc = new FilteredClassifier();
		fc.setClassifier(j48);
		fc.setFilter(smote);
			
		Evaluation eval = new Evaluation(ins);	
		eval.crossValidateModel(fc, ins, 10, new Random(1));
		
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//		System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
		results[index][0] = eval.precision(0);
		results[index][1] = eval.recall(0);
		results[index][2] = eval.fMeasure(0);
		results[index][3] = eval.precision(1);
		results[index][4] = eval.recall(1);
		results[index][5] = eval.fMeasure(1);
		results[index][6] = 1-eval.errorRate();
				
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 50, Source: FeatureSelectionAve.java


Example 13: getEvalResultbyInfoGain

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>To get 10-fold cross validation in one single arff in <b>path</b></p>
	 * <p>Use C4.5 and <b>SMOTE</b>, combined with <b>Information Gain</b> to classify the dataset.</p>
	 * @param path dataset path
	 * @throws Exception
	 */
	public static void getEvalResultbyInfoGain(String path, int index) throws Exception{
		
		Instances ins = DataSource.read(path);
		int numAttr = ins.numAttributes();
		ins.setClassIndex(numAttr - 1);
		
		/**information gain filter to process the whole dataset first*/
		InfoGainAttributeEval evall = new InfoGainAttributeEval();
		Ranker ranker = new Ranker();
		AttributeSelection selector = new AttributeSelection();
		
		selector.setEvaluator(evall);
		selector.setSearch(ranker);
		selector.setInputFormat(ins);
		ins = Filter.useFilter(ins, selector);
		
		SMOTE smote = new SMOTE();
		smote.setInputFormat(ins);
		
		/** classifiers setting*/
		J48 j48 = new J48();
//		j48.setConfidenceFactor(0.4f);
		j48.buildClassifier(ins);

		FilteredClassifier fc = new FilteredClassifier();
		fc.setClassifier(j48);
		fc.setFilter(smote);
			
		Evaluation eval = new Evaluation(ins);	
		eval.crossValidateModel(fc, ins, 10, new Random(1));
		
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//		System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
		results[index][0] = eval.precision(0);
		results[index][1] = eval.recall(0);
		results[index][2] = eval.fMeasure(0);
		results[index][3] = eval.precision(1);
		results[index][4] = eval.recall(1);
		results[index][5] = eval.fMeasure(1);
		results[index][6] = 1-eval.errorRate();
				
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 50, Source: FeatureSelectionAve.java


Example 14: getEvalResultbyGainRatio

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>To get 10-fold cross validation in one single arff in <b>path</b></p>
	 * <p>Use C4.5 and <b>SMOTE</b>, combined with <b>Information Gain Ratio</b> to classify the dataset.</p>
	 * @param path dataset path
	 * @throws Exception
	 */
	public static void getEvalResultbyGainRatio(String path, int index) throws Exception{
		
		Instances ins = DataSource.read(path);
		int numAttr = ins.numAttributes();
		ins.setClassIndex(numAttr - 1);
		
		/**information gain ratio filter to process the whole dataset first*/
		GainRatioAttributeEval evall = new GainRatioAttributeEval();
		Ranker ranker = new Ranker();
		AttributeSelection selector = new AttributeSelection();
		
		selector.setEvaluator(evall);
		selector.setSearch(ranker);
		selector.setInputFormat(ins);
		ins = Filter.useFilter(ins, selector);
		
		SMOTE smote = new SMOTE();
		smote.setInputFormat(ins);
		
		/** classifiers setting*/
		J48 j48 = new J48();
//		j48.setConfidenceFactor(0.4f);
		j48.buildClassifier(ins);

		FilteredClassifier fc = new FilteredClassifier();
		fc.setClassifier(j48);
		fc.setFilter(smote);
			
		Evaluation eval = new Evaluation(ins);	
		eval.crossValidateModel(fc, ins, 10, new Random(1));
		
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//		System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
		results[index][0] = eval.precision(0);
		results[index][1] = eval.recall(0);
		results[index][2] = eval.fMeasure(0);
		results[index][3] = eval.precision(1);
		results[index][4] = eval.recall(1);
		results[index][5] = eval.fMeasure(1);
		results[index][6] = 1-eval.errorRate();
				
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 50, Source: FeatureSelectionAve.java


Example 15: getEvalResultbyCorrelation

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>To get 10-fold cross validation in one single arff in <b>path</b></p>
	 * <p>Use C4.5 and <b>SMOTE</b>, combined with <b>Correlation</b> to classify the dataset.</p>
	 * @param path dataset path
	 * @throws Exception
	 */
	public static void getEvalResultbyCorrelation(String path, int index) throws Exception{
		
		Instances ins = DataSource.read(path);
		int numAttr = ins.numAttributes();
		ins.setClassIndex(numAttr - 1);
		
		/** correlation filter to process the whole dataset first*/
		CorrelationAttributeEval evall = new CorrelationAttributeEval();
		Ranker ranker = new Ranker();
		AttributeSelection selector = new AttributeSelection();
		
		selector.setEvaluator(evall);
		selector.setSearch(ranker);
		selector.setInputFormat(ins);
		ins = Filter.useFilter(ins, selector);
		
		SMOTE smote = new SMOTE();
		smote.setInputFormat(ins);
		
		/** classifiers setting*/
		J48 j48 = new J48();
//		j48.setConfidenceFactor(0.4f);
		j48.buildClassifier(ins);

		FilteredClassifier fc = new FilteredClassifier();
		fc.setClassifier(j48);
		fc.setFilter(smote);
			
		Evaluation eval = new Evaluation(ins);	
		eval.crossValidateModel(fc, ins, 10, new Random(1));
		
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//		System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
		results[index][0] = eval.precision(0);
		results[index][1] = eval.recall(0);
		results[index][2] = eval.fMeasure(0);
		results[index][3] = eval.precision(1);
		results[index][4] = eval.recall(1);
		results[index][5] = eval.fMeasure(1);
		results[index][6] = 1-eval.errorRate();
				
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 50, Source: FeatureSelectionAve.java


Example 16: getEvalResultbyReliefF

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>To get 10-fold cross validation in one single arff in <b>path</b></p>
	 * <p>Use C4.5 and <b>SMOTE</b>, combined with <b>ReliefF</b> to classify the dataset.</p>
	 * @param path dataset path
	 * @throws Exception
	 */
	public static void getEvalResultbyReliefF(String path, int index) throws Exception{
		
		Instances ins = DataSource.read(path);
		int numAttr = ins.numAttributes();
		ins.setClassIndex(numAttr - 1);
		
		/** correlation filter to process the whole dataset first*/
		ReliefFAttributeEval evall = new ReliefFAttributeEval();
		Ranker ranker = new Ranker();
		AttributeSelection selector = new AttributeSelection();
		
		selector.setEvaluator(evall);
		selector.setSearch(ranker);
		selector.setInputFormat(ins);
		ins = Filter.useFilter(ins, selector);
		
		SMOTE smote = new SMOTE();
		smote.setInputFormat(ins);
		
		/** classifiers setting*/
		J48 j48 = new J48();
//		j48.setConfidenceFactor(0.4f);
		j48.buildClassifier(ins);

		FilteredClassifier fc = new FilteredClassifier();
		fc.setClassifier(j48);
		fc.setFilter(smote);
			
		Evaluation eval = new Evaluation(ins);	
		eval.crossValidateModel(fc, ins, 10, new Random(1));
		
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(0), eval.recall(0), eval.fMeasure(0));
//		System.out.printf(" %4.3f %4.3f %4.3f", eval.precision(1), eval.recall(1), eval.fMeasure(1));
//		System.out.printf(" %4.3f \n\n", (1-eval.errorRate()));
		results[index][0] = eval.precision(0);
		results[index][1] = eval.recall(0);
		results[index][2] = eval.fMeasure(0);
		results[index][3] = eval.precision(1);
		results[index][4] = eval.recall(1);
		results[index][5] = eval.fMeasure(1);
		results[index][6] = 1-eval.errorRate();
				
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 50, Source: FeatureSelectionAve.java


Example 17: showFolds

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/***
	 * <p>Using Feature Selection method to get 10 folds results in the given project</p>
	 * @param path project path
	 * @param sel label means we use the Feature Selection method
	 * @throws Exception 
	 */
	public static void showFolds(String path, int k, int flag) throws Exception{
			
		Instances data1 = DataSource.read(path);
		data1.setClassIndex(data1.numAttributes()-1);
		if(!data1.classAttribute().isNominal()) // in case of noisy data, return
			return;
		
		/** Feature Selection: Correlation */
		CorrelationAttributeEval evall = new CorrelationAttributeEval();
		Ranker ranker = new Ranker();
		AttributeSelection selector = new AttributeSelection();		
		selector.setEvaluator(evall);
		selector.setSearch(ranker);
		selector.setInputFormat(data1);
		data1 = Filter.useFilter(data1, selector);

		/** Randomize and stratify the dataset*/
		data1.randomize(new Random(1)); 	
		data1.stratify(10);	// 10 folds
		
		double[][] matrix = new double[10][7];	
		
		for(int i=0; i<10; i++){ // To calculate the results in each fold
			
			Instances test = data1.testCV(10, i);
			Instances train = data1.trainCV(10, i);
			
			/** SMOTE */
			SMOTE smote = new SMOTE();
			smote.setInputFormat(train);
			train = Filter.useFilter(train, smote);

			/** C4.5 */
			J48 rf = new J48();
//			RandomForest rf = new RandomForest();
//			rf.setNumIterations(300);
			rf.buildClassifier(train);
			
			Evaluation eval = new Evaluation(train);
			eval.evaluateModel(rf, test); 
					
			matrix[i][6] = 1-eval.errorRate();
			
			matrix[i][0] = eval.precision(0);
			
			matrix[i][1] = eval.recall(0);
			
			matrix[i][2] = eval.fMeasure(0);
			
			matrix[i][3] = eval.precision(1);
			
			matrix[i][4] = eval.recall(1);
			
			matrix[i][5] = eval.fMeasure(1);
			
		}
		
		for(int i=0;i<10;i++){
			for(int j=0;j<7;j++){
				System.out.printf("%15.8f", matrix[i][j]);
			}
			System.out.println("");
		}
	}
 
Developer: Gu-Youngfeng, Project: CraTer, Lines: 71, Source: FoldResultsAve.java


Example 18: learnParameters

import weka.classifiers.evaluation.Evaluation; // import the required package/class
/**
 * 
 * Learns the rule from parsed features in a cross validation and the set
 * parameters. Additionally feature subset selection is conducted, if the
 * parameters this.forwardSelection or this.backwardSelection are set
 * accordingly.
 * 
 * @param features
 *            Contains features to learn a classifier
 */

@Override
public Performance learnParameters(FeatureVectorDataSet features) {
	// create training
	Instances trainingData = transformToWeka(features, this.trainingSet);

	try {
		Evaluation eval = new Evaluation(trainingData);
		// apply feature subset selection
		if (this.forwardSelection || this.backwardSelection) {

			GreedyStepwise search = new GreedyStepwise();
			search.setSearchBackwards(this.backwardSelection);

			this.fs = new AttributeSelection();
			WrapperSubsetEval wrapper = new WrapperSubsetEval();

			// Do feature subset selection, but using a 10-fold cross
			// validation
			wrapper.buildEvaluator(trainingData);
			wrapper.setClassifier(this.classifier);
			wrapper.setFolds(10);
			wrapper.setThreshold(0.01);

			this.fs.setEvaluator(wrapper);
			this.fs.setSearch(search);

			this.fs.SelectAttributes(trainingData);

			trainingData = fs.reduceDimensionality(trainingData);

		}
		// perform 10-fold Cross Validation to evaluate classifier
		eval.crossValidateModel(this.classifier, trainingData, 10, new Random(1));
		System.out.println(eval.toSummaryString("\nResults\n\n", false));
		
		this.classifier.buildClassifier(trainingData);
		
		int truePositive = (int) eval.numTruePositives(trainingData.classIndex());
		int falsePositive = (int) eval.numFalsePositives(trainingData.classIndex());
		int falseNegative = (int) eval.numFalseNegatives(trainingData.classIndex());
		Performance performance = new Performance(truePositive, truePositive + falsePositive,
				truePositive + falseNegative);

		return performance;

	} catch (Exception e) {
		e.printStackTrace();
		return null;
	}
}
 
Developer: olehmberg, Project: winter, Lines: 62, Source: WekaMatchingRule.java


Example 19: main

import weka.classifiers.evaluation.Evaluation; // import the required package/class
public static void main(String[] args) {
  try {
    Instances inst =
      new Instances(new java.io.BufferedReader(
        new java.io.FileReader(args[0])));
    inst.setClassIndex(inst.numAttributes() - 1);

    weka.classifiers.evaluation.AggregateableEvaluation agg = null;

    WekaClassifierEvaluationMapTask task =
      new WekaClassifierEvaluationMapTask();
    weka.classifiers.trees.J48 classifier = new weka.classifiers.trees.J48();

    WekaClassifierMapTask trainer = new WekaClassifierMapTask();
    trainer.setClassifier(classifier);
    trainer.setTotalNumFolds(10);
    for (int i = 0; i < 10; i++) {
      System.err.println("Processing fold " + (i + 1));
      trainer.setFoldNumber((i + 1));
      trainer.setup(new Instances(inst, 0));
      trainer.addToTrainingHeader(inst);
      trainer.finalizeTask();

      Classifier c = trainer.getClassifier();

      task.setClassifier(c);
      task.setTotalNumFolds(10);
      task.setFoldNumber((i + 1));

      // TODO set the priors properly here.
      task.setup(new Instances(inst, 0), null, -1, 1L, 0);
      for (int j = 0; j < inst.numInstances(); j++) {
        task.processInstance(inst.instance(j));
      }
      task.finalizeTask();

      Evaluation eval = task.getEvaluation();
      if (agg == null) {
        agg = new weka.classifiers.evaluation.AggregateableEvaluation(eval);
      }
      agg.aggregate(eval);
    }

    System.err.println(agg.toSummaryString());
    System.err.println("\n" + agg.toClassDetailsString());
  } catch (Exception ex) {
    ex.printStackTrace();
  }
}
 
Developer: mydzigear, Project: repo.kmeanspp.silhouette_score, Lines: 50, Source: WekaClassifierEvaluationMapTask.java


Example 20: testCrossValidateBatchMapOnly

import weka.classifiers.evaluation.Evaluation; // import the required package/class
@Test
public void testCrossValidateBatchMapOnly() throws Exception {
  Instances train = new Instances(new BufferedReader(new StringReader(
    CorrelationMatrixMapTaskTest.IRIS)));

  train.setClassIndex(train.numAttributes() - 1);

  WekaClassifierEvaluationMapTask evaluator = new WekaClassifierEvaluationMapTask();
  WekaClassifierMapTask trainer = new WekaClassifierMapTask();
  trainer.setClassifier(new weka.classifiers.trees.J48());
  trainer.setTotalNumFolds(10);

  for (int i = 0; i < 10; i++) {
    trainer.setFoldNumber((i + 1));
    trainer.setup(new Instances(train, 0));
    trainer.addToTrainingHeader(train);
    trainer.finalizeTask();

    Classifier c = trainer.getClassifier();

    evaluator.setClassifier(c);
    evaluator.setTotalNumFolds(10);
    evaluator.setFoldNumber(i + 1);

    // priors for iris (just using priors + count from all the data for
    // simplicity)
    double[] priors = { 50.0, 50.0, 50.0 };
    evaluator.setup(new Instances(train, 0), priors, 150, 1L, 0);

    for (int j = 0; j < train.numInstances(); j++) {
      evaluator.processInstance(train.instance(j));
    }
    evaluator.finalizeTask();

    Evaluation eval = evaluator.getEvaluation();
    assertTrue(eval != null);

    // there should be predictions for exactly 15 instances per test fold
    assertEquals(15, (int) eval.numInstances());

    // there shouldn't be any AUC/AUPRC stats computed
    assertTrue(Utils.isMissingValue(eval.areaUnderROC(0)));
  }
}
 
Developer: mydzigear, Project: repo.kmeanspp.silhouette_score, Lines: 45, Source: WekaClassifierEvaluationTest.java



Note: The weka.classifiers.evaluation.Evaluation examples in this article were compiled from source code and documentation platforms such as GitHub and MSDocs. The snippets are taken from open-source projects contributed by their developers; copyright remains with the original authors, and distribution or use should follow the corresponding project's license. Do not reproduce without permission.

