DiffSharp: Differentiable Functional Programming
DiffSharp is a tensor library with advanced support for taking derivatives of tensor code using automatic differentiation.
DiffSharp is designed for use in machine learning, probabilistic programming, optimization and other domains.
Using DiffSharp, you can compute advanced derivatives including gradients, Hessians, Jacobians, directional derivatives, and matrix-free Hessian- and Jacobian-vector products. This goes beyond the simple reverse-mode gradients of traditional tensor libraries such as TensorFlow and PyTorch. The full expressive capability of the language, including control flow, remains available while still preserving the ability to compose advanced differentiation operations. These can nest forward and reverse AD up to any level, meaning that you can compute exact higher-order derivatives or differentiate functions that internally make use of differentiation. Please see the API Overview page for a list of available operations.
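For example, an exact second derivative falls out of nesting one differentiation inside another. A minimal sketch, assuming the `dsharp` entry point and the `dsharp.diff` operation from the API Overview:

```fsharp
open DiffSharp

// A scalar-to-scalar function
let f (x: Tensor) = sin (x * x)

// First derivative of f, itself an ordinary function
let df = dsharp.diff f

// Nesting: differentiate the derivative to get an exact second derivative
let d2f = dsharp.diff df

// Evaluate both at a point
let x = dsharp.tensor 1.5
printfn "f'(x) = %A, f''(x) = %A" (df x) (d2f x)
```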
The library is developed by Atılım Güneş Baydin, Don Syme and other contributors. Please join us!
DiffSharp 1.0 is implemented in F#, and the default backend uses the PyTorch C++ implementation of tensors. It is tested on Linux and Windows.
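Backends can be selected at runtime; a short sketch, assuming the `dsharp.config` entry point and the `Backend.Torch` case (check the API Overview for the exact signature):

```fsharp
open DiffSharp

// Explicitly select the Torch backend, which wraps the
// PyTorch C++ tensor implementation
dsharp.config(backend=Backend.Torch)

// Tensors created from here on are backed by native Torch tensors
let t = dsharp.tensor [ 1.0; 2.0; 3.0 ]
```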
Current Features and Roadmap
The primary features of DiffSharp 1.0 are:
- Tensor programming model for F#
- Reference backend for correctness testing
- PyTorch backend for CUDA support and highly optimized native tensor operations
- Nested differentiation for tensors, supporting forward and reverse AD, or any combination thereof, up to any level
- Matrix-free Jacobian- and Hessian-vector products (sketched below)
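A sketch of the last item; the curried operation names `dsharp.jacobianv` and `dsharp.hessianv` are assumptions based on the API Overview:

```fsharp
open DiffSharp

// A vector-to-vector function, R^3 -> R^2
let f (x: Tensor) = dsharp.stack [ x.[0] * x.[2]; x.[1] + x.[2] ]

// A vector-to-scalar function, R^3 -> R
let g (x: Tensor) = (x * x).sum()

let x = dsharp.tensor [ 1.0; 2.0; 3.0 ]
let v = dsharp.tensor [ 0.5; 0.5; 0.5 ]

// Jacobian-vector product J_f(x) v in a single forward-mode pass,
// without materializing the full Jacobian
let jv = dsharp.jacobianv f x v

// Hessian-vector product H_g(x) v without building the full Hessian
let hv = dsharp.hessianv g x v
```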
See also our GitHub issues.
Please join us to help get the API right and ensure that model development with DiffSharp is as succinct and clean as possible.
Quick Usage Example
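A minimal sketch of typical usage, assuming the `dsharp` entry point and the `diff`, `grad`, and `hessian` operations from the API Overview:

```fsharp
open DiffSharp

// A 1D tensor
let t1 = dsharp.tensor [ 0.0 .. 0.2 .. 1.0 ]

// A 2x2 tensor
let t2 = dsharp.tensor [ [ 0; 1 ]; [ 2; 2 ] ]

// Define a scalar-to-scalar function
let f (x: Tensor) = sin (sqrt x)

// Get its derivative, which is itself a function
let df = dsharp.diff f

// Define a vector-to-scalar function
let g (x: Tensor) = exp (x.[0] * x.[1]) + x.[2]

// Compute its gradient and Hessian
let gg = dsharp.grad g
let hg = dsharp.hessian g
```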
More Info and How to Cite
If you are using DiffSharp, we would be very happy to hear about it! Please get in touch with us using email or raise any issues you might have on GitHub. We also have a Gitter chat room that we follow.
If you would like to cite this library, please use the following information:
Atılım Güneş Baydin, Barak A. Pearlmutter, Alexey Andreyevich Radul, and Jeffrey Mark Siskind (2015) "Automatic differentiation in machine learning: a survey." arXiv preprint arXiv:1502.05767.
