Open-source software name: tensorflow/haskell
Open-source software URL: https://github.com/tensorflow/haskell
Open-source language: Haskell 97.6%

The tensorflow-haskell package provides Haskell bindings to TensorFlow.

This is not an official Google product.

## Documentation

https://tensorflow.github.io/haskell/haddock/

TensorFlow.Core is a good place to start.

## Examples

Neural network model for the MNIST dataset: code

Toy example of a linear regression model (full code):

```haskell
import Control.Monad (replicateM, replicateM_)
import System.Random (randomIO)
import Test.HUnit (assertBool)

import qualified TensorFlow.Core as TF
import qualified TensorFlow.GenOps.Core as TF
import qualified TensorFlow.Minimize as TF
import qualified TensorFlow.Ops as TF hiding (initializedVariable)
import qualified TensorFlow.Variable as TF

main :: IO ()
main = do
    -- Generate data where `y = x*3 + 8`.
    xData <- replicateM 100 randomIO
    let yData = [x*3 + 8 | x <- xData]
    -- Fit linear regression model.
    (w, b) <- fit xData yData
    assertBool "w == 3" (abs (3 - w) < 0.001)
    assertBool "b == 8" (abs (8 - b) < 0.001)

fit :: [Float] -> [Float] -> IO (Float, Float)
fit xData yData = TF.runSession $ do
    -- Create tensorflow constants for x and y.
    let x = TF.vector xData
        y = TF.vector yData
    -- Create scalar variables for slope and intercept.
    w <- TF.initializedVariable 0
    b <- TF.initializedVariable 0
    -- Define the loss function.
    let yHat = (x `TF.mul` TF.readValue w) `TF.add` TF.readValue b
        loss = TF.square (yHat `TF.sub` y)
    -- Optimize with gradient descent.
    trainStep <- TF.minimizeWith (TF.gradientDescent 0.001) loss [w, b]
    replicateM_ 1000 (TF.run trainStep)
    -- Return the learned parameters.
    (TF.Scalar w', TF.Scalar b') <- TF.run (TF.readValue w, TF.readValue b)
    return (w', b')
```

## Installation Instructions

Note: building this repository with `stack` requires a sufficiently recent version; check with `stack --version`.

### Build with Docker on Linux

As an expedient we use docker for building. Once you have docker working, the following commands will compile and run the tests.
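For intuition about what the regression example above is doing, the parameter update performed by `TF.minimizeWith (TF.gradientDescent 0.001)` can be sketched in plain Haskell. This is a hand-derived sketch of one gradient-descent step for the squared-error loss, not the library's implementation; the `step` name and the point set are invented for illustration:

```haskell
-- One gradient-descent step for loss L(w,b) = sum_i (w*x_i + b - y_i)^2.
-- The gradients are dL/dw = sum 2*(w*x+b-y)*x and dL/db = sum 2*(w*x+b-y).
step :: Float -> [(Float, Float)] -> (Float, Float) -> (Float, Float)
step lr pts (w, b) = (w - lr * gw, b - lr * gb)
  where
    errs = [(w * x + b - y, x) | (x, y) <- pts]
    gw = sum [2 * e * x | (e, x) <- errs]
    gb = sum [2 * e | (e, _) <- errs]

main :: IO ()
main = do
  let pts = [(x, 3 * x + 8) | x <- [0, 0.1 .. 1]]  -- same target, y = 3x + 8
      (w, b) = iterate (step 0.001 pts) (0, 0) !! 10000
  print (w, b)  -- converges toward w ≈ 3, b ≈ 8
```

Iterating the same update many times is exactly what `replicateM_ 1000 (TF.run trainStep)` does in the session-based example, except that there the gradients are computed symbolically by TensorFlow.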
There is also a demo application:
### Stack + Docker + GPU

If you want to use GPU you can do:
#### Using nvidia-docker version 2

See the Nvidia docker 2 install instructions.
#### Using nvidia-docker classic

Stack needs to use
### Build on macOS

Run the install_macos_dependencies.sh script. After running the script to install system dependencies, build the project with stack:
### Build on NixOS

The
or alternatively you can run
to enter the environment and build the project. Note that it is an emulation of a common Linux environment rather than a full-featured Nix package expression. No exportable Nix package will appear, but local development is possible.

### Installation on CentOS

Xiaokui Shu (@subbyte) maintains separate instructions for installation on CentOS.

## Related Projects

### Statically validated tensor shapes

https://github.com/helq/tensorflow-haskell-deptyped is experimenting with using dependent types to statically validate tensor shapes. It may be merged with this repository in the future. Example:

```haskell
{-# LANGUAGE DataKinds, ScopedTypeVariables #-}

import Data.Maybe (fromJust)
import Data.Vector.Sized (Vector, fromList)
import TensorFlow.DepTyped

test :: IO (Vector 8 Float)
test = runSession $ do
  (x :: Placeholder "x" '[4,3] Float) <- placeholder
  let elems1 = fromJust $ fromList [1,2,3,4,1,2]
      elems2 = fromJust $ fromList [5,6,7,8]
      (w :: Tensor '[3,2] '[] Build Float) = constant elems1
      (b :: Tensor '[4,1] '[] Build Float) = constant elems2
      y = (x `matMul` w) `add` b -- y shape: [4,2] (b shape is [4,1], but `add` broadcasts it to [4,2])
  let (inputX :: TensorData "x" '[4,3] Float) =
        encodeTensorData . fromJust $ fromList [1,2,3,4,1,0,7,9,5,3,5,4]
  runWithFeeds (feed x inputX :~~ NilFeedList) y

main :: IO ()
main = test >>= print
```

## License

This project is licensed under the terms of the Apache 2.0 license.
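The idea behind the shape-indexed types in the example above can be illustrated without TensorFlow at all. The following is a hypothetical, minimal sketch; the `Mat`, `matMul`, and `rows` names are invented for illustration and are not the tensorflow-haskell-deptyped API:

```haskell
{-# LANGUAGE DataKinds, KindSignatures, ScopedTypeVariables #-}

import Data.Proxy (Proxy (..))
import GHC.TypeLits (KnownNat, Nat, natVal)

-- A matrix tagged with its dimensions at the type level (phantom parameters).
newtype Mat (r :: Nat) (c :: Nat) = Mat [[Double]] deriving Show

-- Multiplication only type-checks when the inner dimensions agree:
-- a `Mat r k` can only be multiplied by a `Mat k c`.
matMul :: Mat r k -> Mat k c -> Mat r c
matMul (Mat a) (Mat b) =
  Mat [[sum (zipWith (*) row col) | col <- transposeL b] | row <- a]
  where
    transposeL xs
      | all null xs = []
      | otherwise   = map head xs : transposeL (map tail xs)

-- The dimension can be recovered at runtime from the type.
rows :: forall r c. KnownNat r => Mat r c -> Integer
rows _ = natVal (Proxy :: Proxy r)

main :: IO ()
main = do
  let a = Mat [[1,2,3],[4,5,6]]   :: Mat 2 3
      b = Mat [[1,0],[0,1],[1,1]] :: Mat 3 2
      Mat c = a `matMul` b
  print c         -- [[4.0,5.0],[10.0,11.0]]
  print (rows a)  -- 2
```

A call like ``a `matMul` (Mat [[1,2]] :: Mat 1 2)`` is rejected at compile time because the inner dimensions (3 vs 1) do not match; tensorflow-haskell-deptyped applies the same principle to tensor shapes, as in the `Tensor '[3,2]` annotations above.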