Toy implementations of popular deep learning optimizers from scratch in JAX
Updated Jun 20, 2021 - Jupyter Notebook
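From-scratch JAX optimizers like those described above are usually written as pure functions: the optimizer state is passed in and returned rather than mutated. A minimal sketch of SGD with momentum in that style (the function and parameter names here are illustrative, not taken from the repository):

```python
import jax.numpy as jnp
from jax import grad

def sgd_momentum_update(params, velocity, grads, lr=0.1, beta=0.9):
    # Functional update: returns new (params, velocity) instead of mutating.
    velocity = beta * velocity - lr * grads
    return params + velocity, velocity

# Toy objective with minimum at w = 3.
loss = lambda w: jnp.sum((w - 3.0) ** 2)

w = jnp.zeros(2)
v = jnp.zeros(2)
for _ in range(200):
    w, v = sgd_momentum_update(w, v, grad(loss)(w))
```

After 200 steps `w` is close to the minimizer `[3, 3]`; swapping in Adam or RMSProp changes only the state tuple and the update rule, which is why the functional pattern is common in JAX.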
An implementation of linear regression using JAX and a functional programming approach. A related blog post: https://medium.com/@sahinadirhan/simple-linear-regression-using-jax-5ef2eefb8cf4
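A functional-style linear regression in JAX typically keeps parameters in an explicit structure and uses `jax.grad` for the update. A hedged sketch in that spirit (names and hyperparameters here are illustrative):

```python
import jax
import jax.numpy as jnp

def predict(params, x):
    w, b = params
    return w * x + b

def mse_loss(params, x, y):
    return jnp.mean((predict(params, x) - y) ** 2)

@jax.jit
def step(params, x, y, lr=0.5):
    # Pure function: gradients in, updated parameters out.
    grads = jax.grad(mse_loss)(params, x, y)
    return tuple(p - lr * g for p, g in zip(params, grads))

# Synthetic data from the line y = 2x + 1.
x = jnp.linspace(0.0, 1.0, 50)
y = 2.0 * x + 1.0

params = (0.0, 0.0)
for _ in range(500):
    params = step(params, x, y)
```

Gradient descent recovers roughly `w = 2, b = 1` on this noiseless data.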
A JAX implementation of DEformer for arbitrary conditioning.
A comprehensive programming environment designed to facilitate research and development at the intersection of high-energy physics and machine learning.
This tutorial is part of the "Intensive Course Machine Learning: From Basics to Advanced Concepts" of the QuCoLiMa research network and the International Max Planck Research School Physics of Light.
A generalized implementation of Grad-CAM for Flax
Minimal forward-mode automatic differentiation using Python's abstract syntax tree
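The repository above derives forward-mode derivatives by rewriting Python's AST; the same forward-mode rule can be sketched more compactly with dual numbers and operator overloading, which this illustrative snippet uses instead:

```python
class Dual:
    """A number carrying a value and a derivative (tangent) part."""

    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot

    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.dot + other.dot)

    __radd__ = __add__

    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        # Product rule: (uv)' = u'v + uv'
        return Dual(self.val * other.val,
                    self.dot * other.val + self.val * other.dot)

    __rmul__ = __mul__

def derivative(f, x):
    # Seed the tangent with 1.0 and read the derivative off the result.
    return f(Dual(x, 1.0)).dot
```

For example, `derivative(lambda x: x * x * x + 2 * x, 2.0)` evaluates d/dx (x^3 + 2x) at x = 2, giving 14.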
Machine learning samples
Python code for the book "Probabilistic Machine Learning" by Kevin Murphy
Searching for galaxy satellites via their impact on the lens potential in a strong gravitational lensing setup
This is the official repository of the paper "RoCourseNet: Distributionally Robust Training of a Prediction Aware Recourse Model".
NXML is an eXtension for Machine Learning
Implicit differentiation with JAX
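The core idea behind implicit differentiation is the implicit function theorem: if x*(θ) solves g(x, θ) = 0, then dx*/dθ = -(∂g/∂x)⁻¹ ∂g/∂θ, so the gradient of the solution can be computed without differentiating through the solver's iterations. A minimal scalar sketch using `jax.grad` (the problem g(x, θ) = x² - θ is chosen for illustration, so x* = √θ and the true gradient is 1/(2√θ)):

```python
import jax

def solve(theta):
    # Newton's method for the root of g(x, theta) = x**2 - theta.
    x = 1.0
    for _ in range(20):
        x = x - (x**2 - theta) / (2 * x)
    return x

def implicit_grad(theta):
    x_star = solve(theta)
    g = lambda x, t: x**2 - t
    dg_dx = jax.grad(g, argnums=0)(x_star, theta)
    dg_dt = jax.grad(g, argnums=1)(x_star, theta)
    # Implicit function theorem: dx*/dtheta = -(dg/dx)^(-1) dg/dtheta
    return -dg_dt / dg_dx
```

At θ = 4 this returns approximately 0.25, matching 1/(2√4), while never backpropagating through the Newton loop.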