
JAX to surpass NumPy in Machine Learning

Automatic differentiation can significantly improve the success of a deep learning project by reducing the development time needed to iterate over models and experiments. Previously, programmers had to derive and implement gradients by hand, a tedious process that left models prone to bugs. TensorFlow and PyTorch are libraries that track gradients through a neural network automatically. They cover the common cases well, but they become limiting when a model falls outside the scope of their built-in functionality.

Autograd is a library for automatically differentiating native Python and NumPy code. JAX is its successor: it combines automatic differentiation with hardware acceleration through XLA (Accelerated Linear Algebra), a domain-specific linear algebra compiler that can also accelerate TensorFlow models. Simply put, JAX (Just After eXecution), developed at Google and used heavily at DeepMind, is a recent machine/deep learning library.
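To make the point concrete, here is a minimal sketch of automatic differentiation with jax.grad; the loss function and inputs are illustrative assumptions, not code from any particular project.

```python
import jax
import jax.numpy as jnp

def loss(w, x, y):
    # Simple squared-error loss for a linear model.
    pred = jnp.dot(x, w)
    return jnp.mean((pred - y) ** 2)

# jax.grad returns a new function that computes d(loss)/dw,
# so no hand-derived gradient code is needed.
grad_loss = jax.grad(loss)

w = jnp.ones(3)
x = jnp.array([[1.0, 2.0, 3.0], [4.0, 5.0, 6.0]])
y = jnp.array([1.0, 2.0])
print(grad_loss(w, x, y))  # gradient with respect to w
```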

Purpose of using JAX

JAX supports automatic differentiation, the core requirement of deep learning, and it also delivers large speed gains, which is why many developers favor it. Because JAX operations are compiled through XLA, they run considerably faster than ordinary Python/NumPy code; reported figures are around 7.3 times faster for normal training and up to 12 times faster in some cases.

JAX’s JIT (Just In Time) compilation further improves speed through the addition of a simple function decorator. JAX also helps developers minimize redundancy through vectorization: machine learning workloads often apply a single function across many examples or datasets, which would otherwise require repeated iterations.
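Below is a minimal sketch of the @jax.jit decorator; the function being compiled and the timing approach are illustrative assumptions.

```python
import time
import jax
import jax.numpy as jnp

@jax.jit
def fused_op(x):
    # Several element-wise operations that XLA can fuse into one kernel.
    return jnp.sum(jnp.tanh(x) ** 2 + 0.5 * x)

x = jnp.arange(1_000_000, dtype=jnp.float32)

# The first call triggers compilation; later calls reuse the compiled kernel.
fused_op(x).block_until_ready()

start = time.perf_counter()
fused_op(x).block_until_ready()  # block_until_ready accounts for async dispatch
print(f"compiled call took {time.perf_counter() - start:.6f}s")
```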

JAX provides automatic vectorization through the vmap transformation and data parallelism across multiple devices through the pmap transformation. JAX is commonly regarded as an alternative deep learning framework, but its applications extend beyond that: Flax, Haiku, and Elegy are deep learning libraries built on top of JAX. JAX also excels at computing Hessians, thanks to XLA, which makes it well suited to higher-order optimization.
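A minimal sketch of vmap for per-example vectorization and jax.hessian for higher-order derivatives follows; the prediction and loss functions are illustrative assumptions. pmap follows the same mapping pattern as vmap but distributes work across devices.

```python
import jax
import jax.numpy as jnp

def predict(w, x):
    # Scalar prediction for a single example.
    return jnp.dot(w, x)

w = jnp.array([1.0, 2.0, 3.0])
batch_x = jnp.ones((8, 3))

# vmap maps predict over the batch dimension of x without a Python loop.
batched_predict = jax.vmap(predict, in_axes=(None, 0))
print(batched_predict(w, batch_x).shape)  # (8,)

def scalar_loss(w):
    return jnp.sum(jnp.sin(w) ** 2)

# jax.hessian composes forward- and reverse-mode autodiff under the hood.
print(jax.hessian(scalar_loss)(w).shape)  # (3, 3)
```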

JAX vs NumPy

JAX runs on GPUs and TPUs as well as CPUs, whereas NumPy runs only on CPUs.

Because JAX exposes a NumPy-like API, code written in familiar NumPy syntax can be compiled and run directly on accelerators such as GPUs and TPUs, making the transition smooth: the same code runs unchanged on both CPUs and GPUs.
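As a minimal sketch, the NumPy-style expressions below run on whatever backend JAX finds (CPU, GPU, or TPU); the specific arrays are illustrative assumptions.

```python
import jax
import jax.numpy as jnp

# The same NumPy-style expressions run unchanged on CPU, GPU, or TPU.
x = jnp.linspace(0.0, 1.0, 10)
y = jnp.sqrt(x) + jnp.sin(x)

print(jax.devices())  # backends JAX found, e.g. [CpuDevice(id=0)]
print(y.sum())        # computed on the default device
```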

Although JAX has specialized constructs, it operates at a lower level than deep learning frameworks, giving developers finer-grained control and making it a natural replacement for NumPy. Because of this bare-metal structure, it can also be used for many kinds of numerical development beyond deep learning.

Overall, JAX can be thought of as an enhanced version of NumPy for the tasks above: its NumPy interface lives in the jax.numpy module, and it mirrors NumPy closely, with the key exception that JAX can run the same code on accelerators.
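The sketch below contrasts numpy and jax.numpy on the same expression; the arrays are illustrative assumptions. One practical difference worth noting is that JAX arrays are immutable, so in-place updates use the functional .at[...] syntax instead of item assignment.

```python
import numpy as np
import jax.numpy as jnp

a_np = np.arange(6.0).reshape(2, 3)
a_jnp = jnp.arange(6.0).reshape(2, 3)

# The two APIs mirror each other closely.
print(np.mean(a_np ** 2))    # plain NumPy on the CPU
print(jnp.mean(a_jnp ** 2))  # same expression, dispatched to the accelerator

# JAX arrays are immutable, so updates are expressed functionally.
a_jnp = a_jnp.at[0, 0].set(42.0)
```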
