
Java MultivariateDifferentiableVectorFunction Class Code Examples


This article collects typical usage examples of the Java class org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction. If you are wondering what MultivariateDifferentiableVectorFunction is for, how to use it, or what real-world usage looks like, the curated class code examples below may help.



The MultivariateDifferentiableVectorFunction class belongs to the org.apache.commons.math3.analysis.differentiation package. Ten code examples of the class are presented below, sorted by popularity by default.
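
Before walking through the examples, here is a minimal sketch of what an implementation of the interface can look like. The class name ProductSumFunction and the model it evaluates are invented for illustration only: the interface adds a single DerivativeStructure-based value method on top of the plain MultivariateVectorFunction evaluation, which is what lets helpers such as JacobianFunction derive the Jacobian automatically.

import org.apache.commons.math3.analysis.differentiation.DerivativeStructure;
import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction;

// Hypothetical example model: f(x, y) = [ x * y, x + 2 * y ].
public class ProductSumFunction implements MultivariateDifferentiableVectorFunction {

    // Plain evaluation, inherited from MultivariateVectorFunction.
    public double[] value(double[] point) {
        return new double[] { point[0] * point[1], point[0] + 2 * point[1] };
    }

    // Differentiable evaluation: the same formulas expressed on DerivativeStructure,
    // so callers can extract partial derivatives without hand-coding a Jacobian.
    public DerivativeStructure[] value(DerivativeStructure[] point) {
        return new DerivativeStructure[] {
            point[0].multiply(point[1]),
            point[0].add(point[1].multiply(2))
        };
    }
}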

Example 1: doTestStRD

import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction; // import the required package/class
public void doTestStRD(final StatisticalReferenceDataset dataset,
    final double errParams, final double errParamsSd) {
    final AbstractLeastSquaresOptimizer optimizer = createOptimizer();
    final double[] w = new double[dataset.getNumObservations()];
    Arrays.fill(w, 1.0);

    final double[][] data = dataset.getData();
    final double[] initial = dataset.getStartingPoint(0);
    final MultivariateDifferentiableVectorFunction problem;
    problem = dataset.getLeastSquaresProblem();
    final PointVectorValuePair optimum;
    optimum = optimizer.optimize(100, problem, data[1], w, initial);

    final double[] actual = optimum.getPoint();
    for (int i = 0; i < actual.length; i++) {
        double expected = dataset.getParameter(i);
        double delta = FastMath.abs(errParams * expected);
        Assert.assertEquals(dataset.getName() + ", param #" + i,
                            expected, actual[i], delta);
    }
}
 
Developer ID: Quanticol, Project: CARMA, Lines: 22, Source: AbstractLeastSquaresOptimizerAbstractTest.java


Example 2: optimize

import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction; // import the required package/class
/**
 * Optimize an objective function.
 * Optimization is considered to be a weighted least-squares minimization.
 * The cost function to be minimized is
 * <code>&sum;weight<sub>i</sub>(objective<sub>i</sub> - target<sub>i</sub>)<sup>2</sup></code>
 *
 * @param maxEval Maximum number of function evaluations.
 * @param f Objective function.
 * @param target Target value for the objective functions at optimum.
 * @param weights Weights for the least squares cost computation.
 * @param startPoint Start point for optimization.
 * @return the point/value pair giving the optimal value of the objective
 * function.
 * @throws org.apache.commons.math3.exception.DimensionMismatchException
 * if the start point dimension is wrong.
 * @throws org.apache.commons.math3.exception.TooManyEvaluationsException
 * if the maximal number of evaluations is exceeded.
 * @throws org.apache.commons.math3.exception.NullArgumentException if
 * any argument is {@code null}.
 */
public PointVectorValuePair optimize(final int maxEval,
                                     final MultivariateDifferentiableVectorFunction f,
                                     final double[] target, final double[] weights,
                                     final double[] startPoint) {

    // Reset counter.
    jacobianEvaluations = 0;

    // Store least squares problem characteristics.
    jF = new JacobianFunction(f);

    // Arrays shared with the other private methods.
    point = startPoint.clone();
    rows = target.length;
    cols = point.length;

    weightedResidualJacobian = new double[rows][cols];
    this.weightedResiduals = new double[rows];

    cost = Double.POSITIVE_INFINITY;

    return optimizeInternal(maxEval, f, target, weights, startPoint);
}
 
Developer ID: SpoonLabs, Project: astor, Lines: 44, Source: AbstractLeastSquaresOptimizer.java


Example 3: doTestStRD

import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction; // import the required package/class
public void doTestStRD(final StatisticalReferenceDataset dataset,
    final double errParams, final double errParamsSd) {
    final AbstractLeastSquaresOptimizer optimizer = createOptimizer();
    final double[] w = new double[dataset.getNumObservations()];
    Arrays.fill(w, 1.0);

    final double[][] data = dataset.getData();
    final double[] initial = dataset.getStartingPoint(0);
    final MultivariateDifferentiableVectorFunction problem;
    problem = dataset.getLeastSquaresProblem();
    final PointVectorValuePair optimum;
    optimum = optimizer.optimize(100, problem, data[1], w, initial);

    final double[] actual = optimum.getPoint();
    final double[] actualSig = optimizer.guessParametersErrors();
    for (int i = 0; i < actual.length; i++) {
        double expected = dataset.getParameter(i);
        double delta = FastMath.abs(errParams * expected);
        Assert.assertEquals(dataset.getName() + ", param #" + i,
                            expected, actual[i], delta);
        expected = dataset.getParameterStandardDeviation(i);
        delta = FastMath.abs(errParamsSd * expected);
        Assert.assertEquals(dataset.getName() + ", sd of param #" + i,
                            expected, actualSig[i], delta);
    }
}
 
Developer ID: SpoonLabs, Project: astor, Lines: 27, Source: AbstractLeastSquaresOptimizerAbstractTest.java


Example 4: solve

import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction; // import the required package/class
@Override
public double[] solve(int maxEval,
        MultivariateDifferentiableVectorFunction f, double[] startValue) {
    final double[] zeros = startValue.clone();
    final double[] ones = startValue.clone();
    Arrays.fill(zeros, 0.0);
    Arrays.fill(ones, 1.0);

    return optim.optimize(new MaxEval(maxEval),
            new InitialGuess(startValue), new Target(zeros),
            new Weight(ones), new ModelFunction(f),
            new ModelFunctionJacobian(new JacobianFunction(f))).getPoint();
}
 
Developer ID: choeger, Project: jdae, Lines: 14, Source: OptimalitySolver.java
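
Example 4 wires the function into the newer OptimizationData-based optimizer API via ModelFunction and ModelFunctionJacobian. As a rough standalone usage sketch (assuming commons-math 3.1/3.2, its org.apache.commons.math3.optim.nonlinear.vector.jacobian.LevenbergMarquardtOptimizer, and the hypothetical ProductSumFunction from the sketch near the top of this article), the same optimization data can be passed directly like this:

import org.apache.commons.math3.analysis.differentiation.JacobianFunction;
import org.apache.commons.math3.optim.InitialGuess;
import org.apache.commons.math3.optim.MaxEval;
import org.apache.commons.math3.optim.PointVectorValuePair;
import org.apache.commons.math3.optim.nonlinear.vector.ModelFunction;
import org.apache.commons.math3.optim.nonlinear.vector.ModelFunctionJacobian;
import org.apache.commons.math3.optim.nonlinear.vector.Target;
import org.apache.commons.math3.optim.nonlinear.vector.Weight;
import org.apache.commons.math3.optim.nonlinear.vector.jacobian.LevenbergMarquardtOptimizer;

public class OptimizationDataUsageSketch {
    public static void main(String[] args) {
        // ProductSumFunction is the hypothetical model sketched earlier in this article.
        ProductSumFunction f = new ProductSumFunction();

        LevenbergMarquardtOptimizer optimizer = new LevenbergMarquardtOptimizer();
        PointVectorValuePair optimum = optimizer.optimize(
                new MaxEval(100),                             // evaluation budget
                new InitialGuess(new double[] { 1.0, 1.0 }),  // start point
                new Target(new double[] { 6.0, 7.0 }),        // arbitrary target values
                new Weight(new double[] { 1.0, 1.0 }),        // uniform weights
                new ModelFunction(f),                         // plain value() evaluation
                new ModelFunctionJacobian(new JacobianFunction(f))); // auto-derived Jacobian

        System.out.println(java.util.Arrays.toString(optimum.getPoint()));
    }
}

Example 4 itself fills the Target with zeros and the weights with ones, which turns the weighted least-squares minimization into plain root finding for f.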


Example 5: testTrivial

import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction; // import the required package/class
@Test
public void testTrivial() {
    LinearProblem problem =
        new LinearProblem(new double[][] { { 2 } }, new double[] { 3 });
    // TODO: the wrapper around GaussNewtonOptimizer is a temporary hack for
    // version 3.1 of the library. It should be removed when GaussNewtonOptimizer
    // is officially declared as implementing MultivariateDifferentiableVectorOptimizer
    MultivariateDifferentiableVectorOptimizer underlyingOptimizer =
            new MultivariateDifferentiableVectorOptimizer() {
        private GaussNewtonOptimizer gn =
                new GaussNewtonOptimizer(true,
                                         new SimpleVectorValueChecker(1.0e-6, 1.0e-6));

        public PointVectorValuePair optimize(int maxEval,
                                             MultivariateDifferentiableVectorFunction f,
                                             double[] target,
                                             double[] weight,
                                             double[] startPoint) {
            return gn.optimize(maxEval, f, target, weight, startPoint);
        }

        public int getMaxEvaluations() {
            return gn.getMaxEvaluations();
        }

        public int getEvaluations() {
            return gn.getEvaluations();
        }

        public ConvergenceChecker<PointVectorValuePair> getConvergenceChecker() {
            return gn.getConvergenceChecker();
        }
    };
    JDKRandomGenerator g = new JDKRandomGenerator();
    g.setSeed(16069223052L);
    RandomVectorGenerator generator =
        new UncorrelatedRandomVectorGenerator(1, new GaussianRandomGenerator(g));
    MultivariateDifferentiableVectorMultiStartOptimizer optimizer =
        new MultivariateDifferentiableVectorMultiStartOptimizer(underlyingOptimizer,
                                                                   10, generator);

    // no optima before first optimization attempt
    try {
        optimizer.getOptima();
        Assert.fail("an exception should have been thrown");
    } catch (MathIllegalStateException ise) {
        // expected
    }
    PointVectorValuePair optimum =
        optimizer.optimize(100, problem, problem.target, new double[] { 1 }, new double[] { 0 });
    Assert.assertEquals(1.5, optimum.getPoint()[0], 1.0e-10);
    Assert.assertEquals(3.0, optimum.getValue()[0], 1.0e-10);
    PointVectorValuePair[] optima = optimizer.getOptima();
    Assert.assertEquals(10, optima.length);
    for (int i = 0; i < optima.length; ++i) {
        Assert.assertEquals(1.5, optima[i].getPoint()[0], 1.0e-10);
        Assert.assertEquals(3.0, optima[i].getValue()[0], 1.0e-10);
    }
    Assert.assertTrue(optimizer.getEvaluations() > 20);
    Assert.assertTrue(optimizer.getEvaluations() < 50);
    Assert.assertEquals(100, optimizer.getMaxEvaluations());
}
 
Developer ID: Quanticol, Project: CARMA, Lines: 63, Source: MultivariateDifferentiableVectorMultiStartOptimizerTest.java


Example 6: optimize

import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction; // import the required package/class
/**
 * Optimize an objective function.
 * Optimization is considered to be a weighted least-squares minimization.
 * The cost function to be minimized is
 * <code>&sum;weight<sub>i</sub>(objective<sub>i</sub> - target<sub>i</sub>)<sup>2</sup></code>
 *
 * @param maxEval Maximum number of function evaluations.
 * @param f Objective function.
 * @param target Target value for the objective functions at optimum.
 * @param weights Weights for the least squares cost computation.
 * @param startPoint Start point for optimization.
 * @return the point/value pair giving the optimal value of the objective
 * function.
 * @throws org.apache.commons.math3.exception.DimensionMismatchException
 * if the start point dimension is wrong.
 * @throws org.apache.commons.math3.exception.TooManyEvaluationsException
 * if the maximal number of evaluations is exceeded.
 * @throws org.apache.commons.math3.exception.NullArgumentException if
 * any argument is {@code null}.
 * @deprecated As of 3.1. Please use
 * {@link BaseAbstractMultivariateVectorOptimizer#optimize(int,
 * org.apache.commons.math3.analysis.MultivariateVectorFunction,OptimizationData[])
 * optimize(int,MultivariateDifferentiableVectorFunction,OptimizationData...)}
 * instead.
 */
@Deprecated
public PointVectorValuePair optimize(final int maxEval,
                                     final MultivariateDifferentiableVectorFunction f,
                                     final double[] target, final double[] weights,
                                     final double[] startPoint) {
    return optimizeInternal(maxEval, f,
                            new Target(target),
                            new Weight(weights),
                            new InitialGuess(startPoint));
}
 
Developer ID: biocompibens, Project: SME, Lines: 36, Source: AbstractLeastSquaresOptimizer.java
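
The deprecated flat-argument entry point shown in Example 6 is inherited by the concrete optimizers of the old org.apache.commons.math3.optimization.general package. A minimal caller-side sketch, assuming the 3.1-era LevenbergMarquardtOptimizer from that package and reusing the hypothetical ProductSumFunction from the earlier sketch (target, weight, and start values are arbitrary), might look like this:

import org.apache.commons.math3.optimization.PointVectorValuePair;
import org.apache.commons.math3.optimization.general.LevenbergMarquardtOptimizer;

public class DeprecatedOptimizeUsageSketch {
    public static void main(String[] args) {
        // LevenbergMarquardtOptimizer extends AbstractLeastSquaresOptimizer, so it
        // exposes the deprecated optimize(maxEval, f, target, weights, startPoint).
        LevenbergMarquardtOptimizer optimizer = new LevenbergMarquardtOptimizer();

        PointVectorValuePair optimum = optimizer.optimize(
                100,                          // maximum number of evaluations
                new ProductSumFunction(),     // hypothetical objective function f
                new double[] { 6.0, 7.0 },    // target values
                new double[] { 1.0, 1.0 },    // weights
                new double[] { 1.0, 1.0 });   // start point

        System.out.println(java.util.Arrays.toString(optimum.getPoint()));
    }
}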


Example 7: optimizeInternal

import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction; // import the required package/class
/**
 * Optimize an objective function.
 * Optimization is considered to be a weighted least-squares minimization.
 * The cost function to be minimized is
 * <code>&sum;weight<sub>i</sub>(objective<sub>i</sub> - target<sub>i</sub>)<sup>2</sup></code>
 *
 * @param maxEval Allowed number of evaluations of the objective function.
 * @param f Objective function.
 * @param optData Optimization data. The following data will be looked for:
 * <ul>
 *  <li>{@link Target}</li>
 *  <li>{@link Weight}</li>
 *  <li>{@link InitialGuess}</li>
 * </ul>
 * @return the point/value pair giving the optimal value of the objective
 * function.
 * @throws org.apache.commons.math3.exception.TooManyEvaluationsException if
 * the maximal number of evaluations is exceeded.
 * @throws DimensionMismatchException if the target and weight arguments
 * have inconsistent dimensions.
 * @see BaseAbstractMultivariateVectorOptimizer#optimizeInternal(int,
 * org.apache.commons.math3.analysis.MultivariateVectorFunction,OptimizationData[])
 * @since 3.1
 * @deprecated As of 3.1. Override is necessary only until this class's generic
 * argument is changed to {@code MultivariateDifferentiableVectorFunction}.
 */
@Deprecated
protected PointVectorValuePair optimizeInternal(final int maxEval,
                                                final MultivariateDifferentiableVectorFunction f,
                                                OptimizationData... optData) {
    // XXX Conversion will be removed when the generic argument of the
    // base class becomes "MultivariateDifferentiableVectorFunction".
    return super.optimizeInternal(maxEval, FunctionUtils.toDifferentiableMultivariateVectorFunction(f), optData);
}
 
Developer ID: biocompibens, Project: SME, Lines: 35, Source: AbstractLeastSquaresOptimizer.java


Example 8: optimize

import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction; // import the required package/class
/**
 * Optimize an objective function.
 * Optimization is considered to be a weighted least-squares minimization.
 * The cost function to be minimized is
 * <code>&sum;weight<sub>i</sub>(objective<sub>i</sub> - target<sub>i</sub>)<sup>2</sup></code>
 *
 * @param maxEval Maximum number of function evaluations.
 * @param f Objective function.
 * @param target Target value for the objective functions at optimum.
 * @param weights Weights for the least squares cost computation.
 * @param startPoint Start point for optimization.
 * @return the point/value pair giving the optimal value of the objective
 * function.
 * @throws org.apache.commons.math3.exception.DimensionMismatchException
 * if the start point dimension is wrong.
 * @throws org.apache.commons.math3.exception.TooManyEvaluationsException
 * if the maximal number of evaluations is exceeded.
 * @throws org.apache.commons.math3.exception.NullArgumentException if
 * any argument is {@code null}.
 * @deprecated As of 3.1. Please use
 * {@link BaseAbstractMultivariateVectorOptimizer#optimize(int,MultivariateVectorFunction,OptimizationData[])
 * optimize(int,MultivariateDifferentiableVectorFunction,OptimizationData...)}
 * instead.
 */
@Deprecated
public PointVectorValuePair optimize(final int maxEval,
                                     final MultivariateDifferentiableVectorFunction f,
                                     final double[] target, final double[] weights,
                                     final double[] startPoint) {
    return optimizeInternal(maxEval, f,
                            new Target(target),
                            new Weight(weights),
                            new InitialGuess(startPoint));
}
 
Developer ID: SpoonLabs, Project: astor, Lines: 35, Source: AbstractLeastSquaresOptimizer.java


Example 9: optimizeInternal

import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction; // import the required package/class
/**
 * Optimize an objective function.
 * Optimization is considered to be a weighted least-squares minimization.
 * The cost function to be minimized is
 * <code>&sum;weight<sub>i</sub>(objective<sub>i</sub> - target<sub>i</sub>)<sup>2</sup></code>
 *
 * @param maxEval Allowed number of evaluations of the objective function.
 * @param f Objective function.
 * @param optData Optimization data. The following data will be looked for:
 * <ul>
 *  <li>{@link Target}</li>
 *  <li>{@link Weight}</li>
 *  <li>{@link InitialGuess}</li>
 * </ul>
 * @return the point/value pair giving the optimal value of the objective
 * function.
 * @throws org.apache.commons.math3.exception.TooManyEvaluationsException if
 * the maximal number of evaluations is exceeded.
 * @throws DimensionMismatchException if the target and weight arguments
 * have inconsistent dimensions.
 * @see BaseAbstractMultivariateVectorOptimizer#optimizeInternal(int,MultivariateVectorFunction,OptimizationData[])
 * @since 3.1
 * @deprecated As of 3.1. Override is necessary only until this class's generic
 * argument is changed to {@code MultivariateDifferentiableVectorFunction}.
 */
@Deprecated
protected PointVectorValuePair optimizeInternal(final int maxEval,
                                                final MultivariateDifferentiableVectorFunction f,
                                                OptimizationData... optData) {
    // XXX Conversion will be removed when the generic argument of the
    // base class becomes "MultivariateDifferentiableVectorFunction".
    return super.optimizeInternal(maxEval, FunctionUtils.toDifferentiableMultivariateVectorFunction(f), optData);
}
 
Developer ID: SpoonLabs, Project: astor, Lines: 34, Source: AbstractLeastSquaresOptimizer.java


Example 10: getLeastSquaresProblem

import org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction; // import the required package/class
/**
 * Returns the least-squares problem corresponding to fitting the model to
 * the specified data.
 *
 * @return the least-squares problem
 */
public MultivariateDifferentiableVectorFunction getLeastSquaresProblem() {
    return problem;
}
 
Developer ID: Quanticol, Project: CARMA, Lines: 10, Source: StatisticalReferenceDataset.java



Note: The org.apache.commons.math3.analysis.differentiation.MultivariateDifferentiableVectorFunction class examples in this article were collected from open-source projects hosted on GitHub, MSDocs, and other source-code and documentation platforms. The snippets were selected from projects contributed by open-source developers, and copyright of the source code remains with the original authors; consult the corresponding project's license before distributing or using the code, and do not reproduce this compilation without permission.

