burn/crates/burn-import
Name                   Last commit                                                            Date
onnx-tests             Fix ONNX where op for scalar inputs (#2218)                            2024-09-03
pytorch-tests          Refactor xtask to use tracel-xtask and refactor CI workflow (#2063)    2024-08-28
src                    Fix ONNX where op for scalar inputs (#2218)                            2024-09-03
Cargo.toml             Bump burn version to 0.15.0                                            2024-08-27
DEVELOPMENT.md         Add subtract tensor from scalar for ONNX sub op (#1964)                2024-07-05
LICENSE-APACHE         Update licenses symlinks (#1613)                                       2024-04-12
LICENSE-MIT            Update licenses symlinks (#1613)                                       2024-04-12
README.md              [refactor] Move burn crates to their own crates directory (#1336)      2024-02-20
SUPPORTED-ONNX-OPS.md  Update SUPPORTED-ONNX-OPS.md (#2217)                                   2024-08-29

README.md

Importing Models

The Burn project supports the import of models from various frameworks, emphasizing efficiency and compatibility. Currently, it handles two primary model formats:

  1. ONNX: Facilitates direct import, ensuring the model's performance and structure are maintained (see the build-script sketch after this list).

  2. PyTorch: Enables the loading of PyTorch model weights into Burn's native model architecture, ensuring seamless integration (see the second sketch below).
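
For ONNX, burn-import is typically driven from a build script that turns the ONNX graph into generated Rust source at compile time. The snippet below is a minimal sketch, assuming burn-import (with its `onnx` feature) is listed as a build dependency and that the model file lives at the hypothetical path src/model/mnist.onnx:

```rust
// build.rs -- minimal sketch; file paths here are illustrative.
use burn_import::onnx::ModelGen;

fn main() {
    // Generate Rust source (and a weights record) for the ONNX graph at build time.
    ModelGen::new()
        .input("src/model/mnist.onnx") // hypothetical location of the ONNX file
        .out_dir("model/")             // generated code lands under OUT_DIR/model/
        .run_from_script();
}
```

The generated module can then be pulled into the crate, for example with `include!(concat!(env!("OUT_DIR"), "/model/mnist.rs"));`.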

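For PyTorch, weights exported from Python (for example with `torch.save(model.state_dict(), "model.pt")`) can be loaded into a Burn module whose structure mirrors the original network. The example below is a rough sketch, assuming burn-import's `pytorch` feature, the NdArray backend, a hypothetical one-layer model named `Net`, and a weights file at the hypothetical path model.pt:

```rust
use burn::nn::{Linear, LinearConfig};
use burn::prelude::*;
use burn::record::{FullPrecisionSettings, Recorder};
use burn_import::pytorch::{LoadArgs, PyTorchFileRecorder};

/// Burn-side mirror of a hypothetical PyTorch model containing a single nn.Linear(2, 4).
#[derive(Module, Debug)]
struct Net<B: Backend> {
    fc: Linear<B>,
}

impl<B: Backend> Net<B> {
    fn init(device: &B::Device) -> Self {
        Self {
            fc: LinearConfig::new(2, 4).init(device),
        }
    }
}

fn main() {
    type B = burn::backend::NdArray;
    let device = Default::default();

    // Read the PyTorch weights file (hypothetical path) into a Burn record...
    let record = PyTorchFileRecorder::<FullPrecisionSettings>::default()
        .load(LoadArgs::new("model.pt".into()), &device)
        .expect("failed to read PyTorch weights");

    // ...then apply the record to a freshly constructed Burn model.
    let _model = Net::<B>::init(&device).load_record(record);
}
```
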
Contribution

Interested in contributing to burn-import? Check out our development guide (DEVELOPMENT.md) for more information.