JAX (software)
JAX is a Python library that provides a machine learning framework for transforming numerical functions, developed by Google with some contributions from Nvidia. It is described as bringing together a modified version of Autograd (automatic differentiation of Python and NumPy functions) and OpenXLA's XLA (Accelerated Linear Algebra) compiler. It is designed to follow the structure and workflow of NumPy as closely as possible, and it works alongside existing frameworks such as TensorFlow and PyTorch. The primary transformations of JAX are:
grad: automatic differentiation
jit: compilation
vmap: auto-vectorization
pmap: Single program, multiple data (SPMD) programming
grad
The code below demonstrates the grad function's automatic differentiation; the final line prints the value of the computed derivative.
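The article's original listing is not reproduced here; the following is a minimal sketch of such a grad example, differentiating f(x) = x³, whose exact derivative is 3x²:

```python
import jax

def f(x):
    return x ** 3

# grad returns a new function that computes df/dx.
df = jax.grad(f)

print(df(2.0))  # derivative of x**3 at x=2 is 3 * 2**2 = 12.0
```

grad composes with itself, so `jax.grad(jax.grad(f))` yields the second derivative.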
jit
The code below demonstrates the jit function's optimization through operation fusion. The computation time for jit_cube should be noticeably shorter than that for cube, and increasing the size of the input widens the gap.
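A minimal sketch of such a jit comparison (not the article's original listing; input shape and iteration count are illustrative):

```python
import timeit

import jax
import jax.numpy as jnp

def cube(x):
    return x * x * x

# jit traces cube once and compiles it with XLA, fusing the multiplies.
jit_cube = jax.jit(cube)

x = jnp.ones((1000, 1000))
jit_cube(x).block_until_ready()  # trigger compilation before timing

t_plain = timeit.timeit(lambda: cube(x).block_until_ready(), number=20)
t_jit = timeit.timeit(lambda: jit_cube(x).block_until_ready(), number=20)
print(f"cube: {t_plain:.4f}s  jit_cube: {t_jit:.4f}s")
```

`block_until_ready()` is needed because JAX dispatches work asynchronously; without it, the timer would measure only dispatch, not computation.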
vmap
The code below demonstrates the vmap function's vectorization.
The GIF on the right of this section illustrates the notion of vectorized addition.
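A minimal sketch of vmap (not the article's original listing): a function written for a single pair of vectors is mapped over a batch without rewriting it with explicit loops.

```python
import jax
import jax.numpy as jnp

def dot(v, w):
    return jnp.dot(v, w)  # scalar dot product of two vectors

# vmap maps dot over the leading axis of both arguments.
batched_dot = jax.vmap(dot)

vs = jnp.array([[1.0, 2.0], [3.0, 4.0]])
ws = jnp.array([[5.0, 6.0], [7.0, 8.0]])
print(batched_dot(vs, ws))  # [17. 53.]
```

The `in_axes` argument controls which axis of each input is mapped; `in_axes=(None, 0)` would hold the first argument fixed while batching over the second.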
pmap
The code below demonstrates the pmap function's parallelization of a matrix multiplication across devices; the final line prints the resulting values.
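A minimal sketch of pmap (not the article's original listing), assuming only that at least one accelerator or CPU device is visible; on a single-device machine the device axis simply has size 1.

```python
import jax
import jax.numpy as jnp
from jax import random

# One replica per local device (SPMD: same program, different data).
n = jax.local_device_count()
keys = random.split(random.PRNGKey(0), n)

# Generate one random 4x4 matrix per device.
mats = jax.pmap(lambda k: random.normal(k, (4, 4)))(keys)

# Each device multiplies its matrix by its transpose in parallel.
result = jax.pmap(lambda m: jnp.dot(m, m.T))(mats)
print(result.shape)  # (n, 4, 4)
```

Unlike vmap, which vectorizes within one device, pmap compiles the function once and runs a replica on each device, with the leading axis of the inputs sharded across them.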
See also
NumPy
TensorFlow
PyTorch
CUDA
External links
Documentation: jax.readthedocs.io
Colab (Jupyter/iPython) quickstart guide: colab.research.google.com/github/google/jax/blob/main/docs/notebooks/quickstart.ipynb
TensorFlow's XLA (Accelerated Linear Algebra): www.tensorflow.org/xla
"Intro to JAX: Accelerating Machine Learning research" on the TensorFlow YouTube channel: www.youtube.com/watch?v=WdTeDXsOSj4
Original paper: mlsys.org/Conferences/doc/2018/146.pdf
References