Tags: machine-learning, computer-vision, autograd, jax, google-jax

How to use grad convolution in google-jax?


Thanks for reading my question!

I was just learning about custom grad functions in JAX, and I found the approach JAX takes to defining them quite elegant.

One thing troubles me though.

I created a wrapper to make lax convolution look like PyTorch conv2d.

from jax import numpy as jnp
from jax.random import PRNGKey
from jax import lax
from torch.nn.modules.utils import _ntuple
import jax
from jax.nn.initializers import normal
from jax import grad

torch_dims = {0: ('NC', 'OI', 'NC'), 1: ('NCH', 'OIH', 'NCH'), 2: ('NCHW', 'OIHW', 'NCHW'), 3: ('NCHWD', 'OIHWD', 'NCHWD')}

def conv(input, weight, bias=None, stride=1, padding=0, dilation=1, groups=1):
    # bias is accepted for signature parity with torch.nn.functional.conv2d but not applied here
    n = len(input.shape) - 2  # number of spatial dimensions
    if isinstance(stride, int):
        stride = _ntuple(n)(stride)
    if isinstance(padding, int):
        padding = [(p, p) for p in _ntuple(n)(padding)]
    if isinstance(dilation, int):
        dilation = _ntuple(n)(dilation)
    # PyTorch's dilation dilates the kernel, which is rhs_dilation in lax terms
    return lax.conv_general_dilated(
        lhs=input, rhs=weight, window_strides=stride, padding=padding,
        lhs_dilation=None, rhs_dilation=dilation, dimension_numbers=torch_dims[n],
        feature_group_count=groups, batch_group_count=1,
        precision=None, preferred_element_type=None)

The problem is that I could not find a way to take its gradient with grad:


init = normal()
rng = PRNGKey(42)
x = init(rng, [128, 3, 224, 224])
k = init(rng, [64, 3, 3, 3])
y = conv(x, k)
grad(conv)(y, k)

This is the error I got:

ValueError: conv_general_dilated lhs feature dimension size divided by feature_group_count must equal the rhs input feature dimension size, but 64 // 1 != 3.

Please help!


Solution

  • When I run your code with the most recent releases of jax and jaxlib (jax==0.2.22; jaxlib==0.1.72), I see the following error:

    TypeError: Gradient only defined for scalar-output functions. Output had shape: (128, 64, 222, 222).
    

    If I create a scalar-output function that uses conv, the gradient seems to work:

    result = grad(lambda x, k: conv(x, k).sum())(x, k)
    print(result.shape)
    # (128, 3, 224, 224)
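
    The same pattern seems to work for the gradient with respect to the kernel via grad's argnums argument (argnums=1 selects the second positional argument, k):

    k_grad = grad(lambda x, k: conv(x, k).sum(), argnums=1)(x, k)
    print(k_grad.shape)
    # (64, 3, 3, 3)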
    

    If you are using an older version of JAX, you might try updating to a more recent version – perhaps the error you're seeing is due to a bug that has already been fixed.
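
    If you really do need gradients of the full, non-scalar output rather than of a scalar loss, jax.vjp is one option: it returns the convolution output together with a function that pulls an output cotangent back to cotangents for x and k. As a sketch, passing an all-ones cotangent reproduces the .sum() gradients above:

    out, vjp_fn = jax.vjp(conv, x, k)
    dx, dk = vjp_fn(jnp.ones_like(out))
    print(dx.shape, dk.shape)
    # (128, 3, 224, 224) (64, 3, 3, 3)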