
jbrea/BayesianOptimization.jl: Bayesian optimization for Julia


Project name: jbrea/BayesianOptimization.jl

Repository: https://github.com/jbrea/BayesianOptimization.jl

Language: Julia 100.0%

Project description:

BayesianOptimization


Usage

using BayesianOptimization, GaussianProcesses, Distributions

f(x) = sum((x .- 1).^2) + randn()                # noisy function to minimize

# Choose as a model an elastic GP with input dimensions 2.
# The GP is called elastic, because data can be appended efficiently.
model = ElasticGPE(2,                            # 2 input dimensions
                   mean = MeanConst(0.),         
                   kernel = SEArd([0., 0.], 5.),
                   logNoise = 0.,
                   capacity = 3000)              # the initial capacity of the GP is 3000 samples.
set_priors!(model.mean, [Normal(1, 2)])

# Optimize the hyperparameters of the GP using maximum a posteriori (MAP) estimates every 50 steps
modeloptimizer = MAPGPOptimizer(every = 50, noisebounds = [-4, 3],       # bounds of the logNoise
                                kernbounds = [[-1, -1, 0], [4, 4, 10]],  # bounds of the 3 kernel parameters (see GaussianProcesses.get_param_names(model.kernel))
                                maxeval = 40)
opt = BOpt(f,
           model,
           UpperConfidenceBound(),                   # type of acquisition
           modeloptimizer,                        
           [-5., -5.], [5., 5.],                     # lowerbounds, upperbounds         
           repetitions = 5,                          # evaluate the function for each input 5 times
           maxiterations = 100,                      # evaluate at 100 input positions
           sense = Min,                              # minimize the function
           acquisitionoptions = (method = :LD_LBFGS, # run optimization of acquisition function with NLopts :LD_LBFGS method
                                 restarts = 5,       # run the NLopt method from 5 random initial conditions each time.
                                 maxtime = 0.1,      # run the NLopt method for at most 0.1 second each time
                                 maxeval = 1000),    # run the NLopt methods for at most 1000 iterations (for other options see https://github.com/JuliaOpt/NLopt.jl)
            verbosity = Progress)

result = boptimize!(opt)
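The returned result bundles both the best value actually observed and the optimum of the fitted surrogate model. A minimal sketch of inspecting it (the field names below follow the output format reported by the package; treat them as an assumption if your version differs):

```julia
result = boptimize!(opt)
println("best observed value: ", result.observed_optimum)
println("best observed input: ", result.observed_optimizer)
println("model-based optimum: ", result.model_optimum)
println("model-based input:   ", result.model_optimizer)
```

With noisy objectives the model-based optimum is often the more trustworthy of the two, since it averages over the observation noise.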

Resume optimization

To continue the optimization, one can call boptimize!(opt) multiple times.

result = boptimize!(opt) # first time (includes initialization)
result = boptimize!(opt) # restart
maxiterations!(opt, 50)  # set maxiterations for the next call
result = boptimize!(opt) # restart again
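Besides maxiterations!, the exported maxduration! helper bounds the next call by wall-clock time rather than by iteration count. A minimal sketch (assuming the second argument is a duration in seconds):

```julia
maxduration!(opt, 60)     # budget the next call at 60 seconds of wall-clock time
result = boptimize!(opt)  # stops when the time budget or maxiterations is exhausted
```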

(Warm-)start with some known function values

By default, the first 5*length(lowerbounds) input points are sampled from a Sobol sequence. If some function values are already available and one wants to skip this Sobol initialization, one can instead update the model with the available data and set initializer_iterations = 0. For example (continuing the example above, after defining model and modeloptimizer):

x = [rand(2) for _ in 1:20]
y = -f.(x)                     # negated observations: with sense = Min the surrogate models -f
append!(model, hcat(x...), y)  # inputs as columns of a matrix

opt = BOpt(f,
           model,
           UpperConfidenceBound(),
           modeloptimizer,                        
           [-5., -5.], [5., 5.],
           maxiterations = 100,
           sense = Min,
           initializer_iterations = 0
          )

result = boptimize!(opt)

This package exports

  • BOpt, boptimize!
  • acquisition types: ExpectedImprovement, ProbabilityOfImprovement, UpperConfidenceBound, ThompsonSamplingSimple, MutualInformation
  • scaling of standard deviation in UpperConfidenceBound: BrochuBetaScaling, NoBetaScaling
  • GP hyperparameter optimizer: MAPGPOptimizer, NoModelOptimizer
  • Initializer: ScaledSobolIterator, ScaledLHSIterator
  • optimization sense: Min, Max
  • verbosity levels: Silent, Timings, Progress
  • helper: maxduration!, maxiterations!
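The exported names above can be mixed and matched when constructing a BOpt. As a sketch, the following swaps in ExpectedImprovement as the acquisition and a Latin-hypercube initializer (the ScaledLHSIterator signature (lowerbounds, upperbounds, N) and the initializer keyword are assumptions mirroring the Sobol default; check the REPL help for your version):

```julia
lo, up = [-5., -5.], [5., 5.]
opt = BOpt(f, model,
           ExpectedImprovement(),                        # acquisition type
           modeloptimizer,
           lo, up,
           initializer = ScaledLHSIterator(lo, up, 25),  # assumed signature: (lower, upper, N)
           maxiterations = 100,
           sense = Min,
           verbosity = Timings)
result = boptimize!(opt)
```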

Use the REPL help, e.g. ?BOpt, to get more information.

Review papers on Bayesian optimization

Similar Projects

BayesOpt is a Julia wrapper of the established BayesOpt toolbox written in C++.

Dragonfly is a feature-rich package for scalable Bayesian optimization written in Python. Use it in Julia with PyCall.
