
JuliaML/LossFunctions.jl: Julia package of loss functions for machine learning.


Project name:

JuliaML/LossFunctions.jl

Project URL:

https://github.com/JuliaML/LossFunctions.jl

Programming language:

Julia 100.0%

Project introduction:

LossFunctions

LossFunctions.jl is a Julia package that provides efficient and well-tested implementations for a diverse set of loss functions that are commonly used in Machine Learning.


Available Losses

[Overview plots: distance-based (regression) losses and margin-based (classification) losses]

Please consult the documentation for other losses.

Introduction

Typically, the loss functions we work with in Machine Learning fall into the category of supervised losses. These are multivariate functions of two variables, the true target y, which represents the "ground truth" (i.e. correct answer), and the predicted output ŷ, which is what our model thinks the truth is. A supervised loss function takes these two variables as input and returns a value that quantifies how "bad" our prediction is in comparison to the truth. In other words: the lower the loss, the better the prediction.
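
For instance, with the least-squares loss L(y, ŷ) = (ŷ − y)², a prediction ŷ = 0.8 against a true target y = 1.0 incurs a loss of (0.8 − 1.0)² = 0.04, while the perfect prediction ŷ = y incurs a loss of 0.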

This package provides a considerable number of carefully implemented loss functions, as well as an API to query their properties (e.g. convexity). Furthermore, we expose methods to compute their values, derivatives, and second derivatives for single observations as well as arbitrarily sized arrays of observations. For arrays, the user can additionally specify whether and how the element-wise results are averaged or summed.
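
As a minimal sketch of that interface (assuming the value/deriv/deriv2 functions and the AggMode aggregation modes described in the package documentation; names and argument order may differ between releases):

using LossFunctions

loss = L2DistLoss()            # least-squares loss: L(y, ŷ) = (ŷ - y)^2

# single observation: true target y = 1.0, predicted output ŷ = 0.8
value(loss, 1.0, 0.8)          # 0.04
deriv(loss, 1.0, 0.8)          # -0.4, derivative with respect to ŷ
deriv2(loss, 1.0, 0.8)         # 2.0, second derivative

# arrays of observations, with element-wise results summed or averaged
y = [1.0, -0.5, 2.0]
ŷ = [0.9, -0.4, 1.5]
value(loss, y, ŷ, AggMode.Sum())    # sum of the element-wise losses
value(loss, y, ŷ, AggMode.Mean())   # mean of the element-wise losses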

Documentation

Check out the latest documentation

Additionally, you can make use of Julia's native documentation system. The following example shows how to get additional information on HingeLoss within Julia's REPL:

?HingeLoss
search: HingeLoss L2HingeLoss L1HingeLoss SmoothedL1HingeLoss

  L1HingeLoss <: MarginLoss

  The hinge loss linearly penalizes every prediction where the
  resulting agreement a = y⋅ŷ < 1. It is Lipschitz continuous
  and convex, but not strictly convex.

  L(a) = \max \{ 0, 1 - a \}

  --------------------------------------------------------------------

              Loss function                    Derivative
        ┌────────────┬────────────┐      ┌────────────┬────────────┐
      3 │'\.                      │    0 │                  ┌------│
        │  ''_                    │      │                  |      │
        │     \.                  │      │                  |      │
        │       '.                │      │                  |      │
      L │         ''_             │   L' │                  |      │
        │            \.           │      │                  |      │
        │              '.         │      │                  |      │
      0 │                ''_______│   -1 │------------------┘      │
        └────────────┴────────────┘      └────────────┴────────────┘
        -2                        2      -2                        2
                   y ⋅ ŷ                            y ⋅ ŷ
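
To tie the docstring to the plots: the agreement a = y ⋅ ŷ is at least 1 exactly when the prediction is confidently on the correct side, which is where the hinge loss vanishes. A small sketch under the same assumed interface as above (not verified against every release):

using LossFunctions

loss = L1HingeLoss()     # L(a) = max(0, 1 - a) with agreement a = y ⋅ ŷ

value(loss, +1, 0.3)     # 0.7: correct sign, but inside the margin (a = 0.3)
value(loss, +1, 1.5)     # 0.0: agreement a ≥ 1, no penalty
deriv(loss, +1, 0.3)     # -1.0, matching the derivative plot above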

Installation

Get the latest stable release with Julia's package manager:

] add LossFunctions

License

This code is free to use under the terms of the MIT license.



