FuncTorchTLS Class — PyTorch Architecture

Architecture documentation for the FuncTorchTLS class in DynamicLayer.cpp from the PyTorch codebase. FuncTorchTLS is functorch's thread-local state object: it holds the stack of active transforms (dynamicLayerStack) and exposes check* methods that reject autograd features, such as C++ autograd::Function, Tensor.requires_grad_(), and Tensor.retain_grad(), which are unsupported while a transform is active, unless an opt-in flag permits them.

Entity Profile

Source Code

aten/src/ATen/functorch/DynamicLayer.cpp lines 83–129

class FuncTorchTLS : public FuncTorchTLSBase {
 public:
  FuncTorchTLS() = default;

  std::unique_ptr<FuncTorchTLSBase> deepcopy() const override {
    auto result = std::make_unique<FuncTorchTLS>();
    result->dynamicLayerStack = dynamicLayerStack;
    return result;
  }

  int64_t checkSupportsSingleLevelAutogradFunction() const override {
    TORCH_INTERNAL_ASSERT(dynamicLayerStack.empty() || getSingleLevelAutogradFunctionAllowed(),
        "functorch functions (vmap, grad, vjp, etc.) incorrectly used with ",
        "torch.autograd.function._SingleLevelFunction. ",
        "This is not expected, please file a bug.");
    return 0;
  }

  void checkSupportsCppAutogradFunction() const override {
    TORCH_CHECK(
        dynamicLayerStack.empty(),
        "cannot use C++ torch::autograd::Function with functorch transforms (vmap, grad, vjp, etc)");
  }

  void checkSupportsInplaceRequiresGrad() const override {
    TORCH_CHECK(dynamicLayerStack.empty() || allow_inplace_requires_grad_,
        "You are attempting to call Tensor.requires_grad_() (or perhaps using ",
        "torch.autograd.functional.* APIs) inside of a function being transformed ",
        "by a functorch transform. ",
        "This is unsupported, please attempt to use the functorch transforms ",
        "(e.g. grad, vjp, jacrev, jacfwd, hessian) or call requires_grad_() "
        "outside of a function being transformed instead.");
  }
  void checkSupportsRetainGrad() const override {
    TORCH_CHECK(dynamicLayerStack.empty(),
        "You are attempting to call Tensor.retain_grad() ",
        "inside of a function being transformed ",
        "by a functorch transform. ",
        "This is unsupported, please attempt to use the functorch transforms ",
        "(e.g. grad, vjp, jacrev, jacfwd, hessian) or call retain_grad() "
        "outside of a function being transformed instead.");
  }

  std::vector<DynamicLayer> dynamicLayerStack;
  bool allow_inplace_requires_grad_ = false;
  bool allow_single_level_autograd_function_ = false;
};
