Julia is a high-level, dynamic programming language that is fast, flexible, easy to use and scalable, with strong support for high-speed mathematical computation. The language also runs on a wide range of hardware, including GPUs and TPUs, across every major cloud. Julia uses multiple dispatch as a paradigm, making it easy to express many object-oriented and functional programming patterns. In one of our articles, we discussed how this language is making AI and machine learning better.
In this article, we list the top 9 machine learning frameworks in Julia that one must know.
(The libraries are listed according to their stars on GitHub)
1| Flux
About: Flux is a deep learning and machine learning library that provides a single, intuitive way to define models, much like mathematical notation. Existing Julia libraries are differentiable and can be incorporated directly into Flux models. Notable features include compiled eager code, differentiable programming, GPU support and ONNX support, among others.
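To give a flavour of this, here is a minimal sketch of defining and running a small model in Flux (assuming a recent release of Flux.jl where layers are written as `Dense(in => out, activation)`; the layer sizes and data are purely illustrative):

```julia
using Flux

# A small multi-layer perceptron, written much like its mathematical definition
model = Chain(
    Dense(4 => 8, relu),   # 4 inputs -> 8 hidden units with ReLU
    Dense(8 => 2),         # 8 hidden units -> 2 outputs
    softmax)               # normalise outputs to probabilities

x = rand(Float32, 4)       # a dummy input vector
ŷ = model(x)               # forward pass

# Differentiable programming: gradients of an arbitrary loss w.r.t. the model
grads = Flux.gradient(m -> sum(m(x)), model)
```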
2| Mocha.jl
About: Mocha.jl is a deep learning library for the Julia programming language. Its key features are listed below, followed by a minimal usage sketch:
- Written in Julia and for Julia: Mocha.jl is completely written in Julia. This means that the library has native Julia interfaces and is capable of interacting with core Julia functionality as well as other Julia packages.
- Minimal dependencies: The pure-Julia backend requires only minimal dependencies, with no need for root privileges or the installation of external dependencies.
- Multiple backends: In addition to the CPU backend, the library comes with a GPU backend that combines customised kernels with highly efficient NVIDIA libraries such as cuBLAS and cuDNN.
- Modularity and correctness: The library is implemented in a modular architecture, with unit tests to help ensure correctness.
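For a sense of the API, here is a minimal sketch of assembling a network with Mocha.jl in the style of its documented examples; the specific layer constructors, keyword arguments and data source used here are illustrative assumptions and may differ between versions:

```julia
using Mocha

backend = CPUBackend()   # the pure-Julia CPU backend
init(backend)

# Layers are declared individually and wired together by symbol names
data = HDF5DataLayer(name="data", source="train.txt", batch_size=64)
ip   = InnerProductLayer(name="ip", output_dim=10,
                         bottoms=[:data], tops=[:ip])
loss = SoftmaxLossLayer(name="loss", bottoms=[:ip, :label])

net = Net("example-net", backend, [data, ip, loss])

shutdown(backend)
```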
3| Knet
About: Knet is a deep learning framework implemented in the Julia programming language. Knet allows models to be defined by simply describing their forward computation in plain Julia, which permits the use of loops, conditionals, recursion, closures, tuples, dictionaries, array indexing, concatenation and other high-level language features. The library supports GPU operations and automates differentiation using dynamic computational graphs for models defined in plain Julia.
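As a rough sketch of this style, a linear model can be written as ordinary Julia functions and differentiated automatically; this assumes Knet's classic `grad`-based API (re-exported from AutoGrad), and names may vary between versions:

```julia
using Knet

# Forward computation written in plain Julia: w is a (weights, bias) pair
predict(w, x) = w[1] * x .+ w[2]

# Squared-error loss, again in plain Julia
loss(w, x, y) = sum(abs2, y .- predict(w, x)) / length(y)

# grad returns a function that computes the gradient of the loss w.r.t. w
lossgradient = grad(loss)

w = Any[0.1 * randn(1, 3), zeros(1, 1)]   # toy weights for 3 input features
x, y = randn(3, 10), randn(1, 10)         # toy data
g = lossgradient(w, x, y)                 # gradients with the same structure as w
```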
4| TensorFlow.jl
About: TensorFlow.jl is a Julia wrapper for the popular open-source machine learning library TensorFlow. The wrapper is useful for purposes such as fast ingestion of data, especially data in uncommon formats, and fast post-processing of inference results, such as calculating statistics and visualisations that do not have a canned vectorised implementation.
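The wrapper follows the graph-and-session style of the underlying library. A minimal sketch, assuming the TensorFlow 1.x-style API exposed by TensorFlow.jl (exact exported names may differ between releases):

```julia
using TensorFlow

sess = Session(Graph())

x = TensorFlow.constant(Float64[1.0, 2.0])   # a constant node in the graph
z = TensorFlow.placeholder(Float64)          # value supplied at run time
w = exp(x + z)                               # ops compose like ordinary Julia math

# Run the graph, feeding a value for the placeholder
result = run(sess, w, Dict(z => Float64[3.0, 4.0]))
```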
5| ScikitLearn.jl
About: ScikitLearn.jl brings the interface and algorithms of the popular Python library Scikit-learn to Julia. It provides a uniform interface for training and using models, as well as a set of tools for chaining models (pipelines), evaluation and hyperparameter tuning. It supports both models from the Julia ecosystem and those of the Scikit-learn library (accessed via PyCall).
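A minimal sketch of the uniform fit/predict interface, assuming PyCall and the Python scikit-learn package are installed; the dataset below is a toy stand-in:

```julia
using ScikitLearn
@sk_import linear_model: LogisticRegression   # import a Python scikit-learn model

X = rand(100, 2)                      # toy features
y = Int.(X[:, 1] .+ X[:, 2] .> 1.0)   # toy binary labels

model = LogisticRegression()
fit!(model, X, y)                     # the same interface works for Julia models too
ŷ = predict(model, X)
```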
6| MXNet.jl
About: MXNet.jl is the Apache MXNet Julia package that brings flexible and efficient GPU computing and state-of-the-art deep learning to Julia. Its features include efficient tensor and matrix computation across multiple devices, including multiple CPUs, GPUs and distributed server nodes. It also offers flexible symbolic manipulation for composing and constructing state-of-the-art deep learning models.
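A minimal sketch of the symbolic API, in the style of the MXNet.jl examples; the `@mx.chain` macro, layer names and keyword arguments are assumptions based on those examples and may vary with the package version:

```julia
using MXNet

# Compose a small multi-layer perceptron symbolically
mlp = @mx.chain mx.Variable(:data)                  =>
      mx.FullyConnected(name=:fc1, num_hidden=64)   =>
      mx.Activation(name=:relu1, act_type=:relu)    =>
      mx.FullyConnected(name=:fc2, num_hidden=10)   =>
      mx.SoftmaxOutput(name=:softmax)
```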
7| MLBase.jl
About: MLBase.jl is a Julia package that provides a collection of useful tools to support machine learning programs, including data manipulation and preprocessing, score-based classification, performance evaluation, cross-validation and model tuning.
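For instance, label encoding and a simple performance measure look roughly like this; the function names are taken from MLBase.jl's toolset and should be checked against the current documentation:

```julia
using MLBase

# Map arbitrary labels to integer codes and encode new data with that map
lm      = labelmap(["cat", "dog", "dog", "bird"])
encoded = labelencode(lm, ["dog", "cat", "bird"])   # -> [2, 1, 3]

# Evaluate predictions against ground-truth labels
gt   = [1, 1, 2, 2, 3]
pred = [1, 2, 2, 2, 3]
acc  = correctrate(gt, pred)   # fraction of correct predictions (0.8 here)
```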
8| Merlin
About: Merlin is a deep learning framework written in Julia. It aims to be a fast, flexible and compact deep learning library for machine learning. Its requirements are Julia 0.6 and g++ (on macOS or Linux). The library runs on CPUs and CUDA GPUs.
9| Strada
About: Strada is an open-source deep learning library for Julia, based on the popular Caffe framework. The library supports training convolutional and recurrent neural networks, both on CPUs and GPUs. Its features include flexibility, support for Caffe features and integration with Julia, among others.