This is a repository for Burn image models, inspired by the Python timm package.
The future feature list, in short: timm => bimm.
See the CONTRIBUTING guide for build and contribution instructions.
bimm - the main crate for image models.
- bimm::cache - weight loading cache.
- bimm::layers - reusable neural network modules.
  - bimm::layers::activation - activation layers.
    - bimm::layers::activation::Activation - activation layer abstraction wrapper (see the sketch after this list).
  - bimm::layers::blocks - miscellaneous blocks.
    - bimm::layers::blocks::conv_norm - Conv2d + BatchNorm2d block.
  - bimm::layers::drop - dropout layers.
    - bimm::layers::drop::drop_block - 2d drop block / spatial dropout.
    - bimm::layers::drop::drop_path - drop path / stochastic depth.
  - bimm::layers::patching - patching layers.
    - bimm::layers::patching::patch_embed - 2d patch embedding layer.
- bimm::models - complete model families.
  - bimm::models::resnet - the ResNet model family.
  - bimm::models::swin - the SWIN family.
    - bimm::models::swin::v2 - the SWIN-V2 model.
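To make the Activation wrapper concrete, here is a minimal sketch of the idea: a config-time enum that dispatches to Burn's functional activations. The enum name, variants, and method below are illustrative assumptions, not bimm's actual API:

```rust
use burn::prelude::*;
use burn::tensor::activation;

/// Hypothetical activation selector; bimm's real `Activation` may differ.
#[derive(Debug, Clone, Copy)]
pub enum Act {
    Relu,
    Gelu,
    Silu,
}

impl Act {
    /// Apply the selected activation to a tensor of any rank.
    pub fn forward<B: Backend, const D: usize>(&self, x: Tensor<B, D>) -> Tensor<B, D> {
        match self {
            Act::Relu => activation::relu(x),
            Act::Gelu => activation::gelu(x),
            Act::Silu => activation::silu(x),
        }
    }
}
```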
Example of building a pretrained model:
```rust
use burn::backend::Wgpu;
use bimm::cache::disk::DiskCacheConfig;
use bimm::models::resnet::{PREFAB_RESNET_MAP, ResNet};

let device = Default::default();

let prefab = PREFAB_RESNET_MAP.expect_lookup_prefab("resnet18");

let weights = prefab
    .expect_lookup_pretrained_weights("tv_in1k")
    .fetch_weights(&DiskCacheConfig::default())
    .expect("Failed to fetch weights");

let model: ResNet<Wgpu> = prefab
    .to_config()
    .to_structure()
    .init(&device)
    .load_pytorch_weights(weights)
    .expect("Failed to load weights")
    // Re-head the model to 10 classes:
    .with_classes(10)
    // Enable (drop_block_prob) stochastic block drops for training:
    .with_stochastic_drop_block(0.2)
    // Enable (drop_path_prob) stochastic depth for training:
    .with_stochastic_path_depth(0.1);
```
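Continuing the example, inference is then a normal Burn forward pass. A minimal sketch, assuming the model exposes a `forward` method over NCHW image batches (the method name and layout here are assumptions, not confirmed bimm API):

```rust
use burn::prelude::*;
use burn::tensor::Distribution;

// A random stand-in for a batch of four 224x224 RGB images.
let images = Tensor::<Wgpu, 4>::random([4, 3, 224, 224], Distribution::Default, &device);

// With the 10-class re-heading above, this would yield [4, 10] logits.
let logits = model.forward(images);
```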
Add-on crates:

bimm-contracts - a crate for static shape contracts for tensors.
This crate is now hosted in its own repository: bimm-contracts
This crate provides a stand-alone library for defining and enforcing tensor shape contracts in-line with the Burn framework modules and methods.
For example:

```rust
use bimm_contracts::{assert_shape_contract_periodically, unpack_shape_contract};
use burn::prelude::{Backend, Tensor};
use burn::tensor::BasicOps;

pub fn window_partition<B: Backend, K>(
    tensor: Tensor<B, 4, K>,
    window_size: usize,
) -> Tensor<B, 4, K>
where
    K: BasicOps<B>,
{
    let [b, h_wins, w_wins, c] = unpack_shape_contract!(
        [
            "batch",
            "height" = "h_wins" * "window_size",
            "width" = "w_wins" * "window_size",
            "channels"
        ],
        &tensor,
        &["batch", "h_wins", "w_wins", "channels"],
        &[("window_size", window_size)],
    );

    let tensor = tensor
        .reshape([b, h_wins, window_size, w_wins, window_size, c])
        .swap_dims(2, 3)
        .reshape([b * h_wins * w_wins, window_size, window_size, c]);

    // Run an amortized check on the output shape.
    //
    // `assert_shape_contract_periodically!` is built on `run_periodically!{}`,
    // which runs the first 10 times, then on an incrementally lengthening
    // schedule, until it reaches its default period of 1000.
    //
    // Due to amortization, in release builds, this averages ~4ns:
    assert_shape_contract_periodically!(
        [
            "batch" * "h_wins" * "w_wins",
            "window_size",
            "window_size",
            "channels"
        ],
        &tensor,
        &[
            ("batch", b),
            ("h_wins", h_wins),
            ("w_wins", w_wins),
            ("window_size", window_size),
            ("channels", c),
        ]
    );

    tensor
}
```
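A quick usage sketch for `window_partition` (the backend and shapes are arbitrary illustration choices):

```rust
use burn::backend::Wgpu;
use burn::prelude::*;
use burn::tensor::Distribution;

let device = Default::default();

// [batch=2, height=8, width=8, channels=3], split into 4x4 windows.
let x = Tensor::<Wgpu, 4>::random([2, 8, 8, 3], Distribution::Default, &device);
let windows = window_partition(x, 4);

// 2 * (8 / 4) * (8 / 4) = 8 windows, each [4, 4, 3].
assert_eq!(windows.dims(), [8, 4, 4, 3]);
```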
bimm-firehose - a data loading and augmentation framework.
This crate provides a SQL-inspired table + operations framework for modular data pipeline construction.
It's still very much a work in progress, and any issues/design bugs reported are very appreciated.
A companion crate provides a set of image-specific operations for bimm-firehose.
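To make the table + operations idea concrete, here is a purely illustrative toy sketch of such a pipeline; none of these type or trait names come from bimm-firehose:

```rust
// Purely illustrative toy code; none of these names exist in bimm-firehose.

/// A hypothetical row with named columns.
#[derive(Debug, Clone)]
struct Row {
    path: String,
    width: Option<u32>,
    height: Option<u32>,
}

/// A hypothetical operation: reads some columns, fills in others.
trait Operation {
    fn apply(&self, row: &mut Row);
}

/// A toy op that pretends to probe image dimensions from `path`.
struct ProbeSize;

impl Operation for ProbeSize {
    fn apply(&self, row: &mut Row) {
        // Stand-in logic; a real op would decode the file at `row.path`.
        row.width = Some(224);
        row.height = Some(224);
    }
}

fn main() {
    // A "table" is a batch of rows; ops run over it in sequence,
    // so pipelines compose by appending operations.
    let mut table = vec![Row {
        path: "img_0.png".into(),
        width: None,
        height: None,
    }];
    let ops: Vec<Box<dyn Operation>> = vec![Box::new(ProbeSize)];
    for op in &ops {
        for row in &mut table {
            op.apply(row);
        }
    }
    println!("{table:?}");
}
```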