Project: jbrea/BayesianOptimization.jl
Repository: https://github.com/jbrea/BayesianOptimization.jl
Language: Julia (100.0%)

BayesianOptimization

Usage

using BayesianOptimization, GaussianProcesses, Distributions
f(x) = sum((x .- 1).^2) + randn() # noisy function to minimize
# Choose as a model an elastic GP with input dimensions 2.
# The GP is called elastic, because data can be appended efficiently.
model = ElasticGPE(2, # 2 input dimensions
mean = MeanConst(0.),
kernel = SEArd([0., 0.], 5.),
logNoise = 0.,
capacity = 3000) # the initial capacity of the GP is 3000 samples.
set_priors!(model.mean, [Normal(1, 2)])
# Optimize the hyperparameters of the GP using maximum a posteriori (MAP) estimates every 50 steps
modeloptimizer = MAPGPOptimizer(every = 50, noisebounds = [-4, 3], # bounds of the logNoise
kernbounds = [[-1, -1, 0], [4, 4, 10]], # bounds of the 3 kernel parameters (see GaussianProcesses.get_param_names(model.kernel))
maxeval = 40)
opt = BOpt(f,
model,
UpperConfidenceBound(), # type of acquisition
modeloptimizer,
[-5., -5.], [5., 5.], # lowerbounds, upperbounds
repetitions = 5, # evaluate the function for each input 5 times
maxiterations = 100, # evaluate at 100 input positions
sense = Min, # minimize the function
acquisitionoptions = (method = :LD_LBFGS, # run the optimization of the acquisition function with NLopt's :LD_LBFGS method
restarts = 5, # run the NLopt method from 5 random initial conditions each time.
maxtime = 0.1, # run the NLopt method for at most 0.1 second each time
maxeval = 1000), # run the NLopt method for at most 1000 iterations (for other options see https://github.com/JuliaOpt/NLopt.jl)
verbosity = Progress)
result = boptimize!(opt)

Resume optimization

To continue the optimization, one can call

result = boptimize!(opt) # first time (includes initialization)
result = boptimize!(opt) # restart
maxiterations!(opt, 50) # set maxiterations for the next call
result = boptimize!(opt) # restart again

(Warm-)start with some known function values

By default, the first initializer_iterations evaluations are used for initialization, i.e. the function is evaluated at points sampled from the input space. To start from already known function values instead, append them to the model and set initializer_iterations = 0:

x = [rand(2) for _ in 1:20]
y = -f.(x)
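# note the sign flip above: the model is fed -f(x) here, matching the minimization (sense = Min) below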
append!(model, hcat(x...), y)
opt = BOpt(f,
model,
UpperConfidenceBound(),
modeloptimizer,
[-5., -5.], [5., 5.],
maxiterations = 100,
sense = Min,
initializer_iterations = 0
)
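# with initializer_iterations = 0, no additional initialization samples are drawn; optimization starts directly from the appended data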
result = boptimize!(opt)

This package exports the types and functions shown in the examples above, among others.
Use the REPL help (e.g. ?BOpt) to get more information about the exported names.
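For example, typing ? at the Julia prompt switches to help mode, where any exported name can be looked up (shown here for BOpt; the printed docstring depends on the installed version):

julia> using BayesianOptimization

help?> BOpt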
Review papers on Bayesian optimization

Similar Projects

BayesOpt is a wrapper of the established BayesOpt toolbox written in C++. Dragonfly is a feature-rich package for scalable Bayesian optimization written in Python; use it in Julia with PyCall.
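As an illustration, here is a minimal sketch of driving Dragonfly from Julia through PyCall. It assumes Dragonfly is installed in the Python environment PyCall uses (e.g. via pip install dragonfly-opt) and relies on Dragonfly's top-level minimise_function(objective, domain, budget) entry point; check the Dragonfly documentation for the exact options.

using PyCall

dragonfly = pyimport("dragonfly")   # requires dragonfly-opt in PyCall's Python environment

g(x) = sum((x .- 1).^2)             # noiseless version of the objective used above
domain = [[-5., 5.], [-5., 5.]]     # one [lower, upper] pair per input dimension
# minimise_function returns the best value found, the corresponding point, and a history object
min_val, min_pt, history = dragonfly.minimise_function(g, domain, 50)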