We have moved to https://openmodeldb.info/. This new site has a tag and search system, which will make finding the right models for you much easier! If you have any questions, ask here: https://discord.gg/cpAUpDK

Maintained BasicSR Forks

Currently Maintained BasicSR Forks

NeoSR

The latest fork, by musl. It has AMP (automatic mixed precision) training support and takes an opinionated approach to which losses and architectures are integrated, leaving out those that don't add a meaningful benefit to the training process. It supports all of the relevant modern architectures and is therefore the recommended fork as of September 2023.
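
For context, "AMP" here refers to PyTorch's automatic mixed precision. The sketch below is a generic illustration of what an AMP training step looks like in plain PyTorch; it is not NeoSR's actual code, and the network, data, and loss are stand-ins.

 # Generic PyTorch AMP training step (illustration only, not NeoSR code).
 import torch
 import torch.nn as nn
 
 device = "cuda" if torch.cuda.is_available() else "cpu"
 model = nn.Conv2d(3, 3, 3, padding=1).to(device)              # stand-in for an SR network
 optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
 scaler = torch.cuda.amp.GradScaler(enabled=(device == "cuda"))
 criterion = nn.L1Loss()
 
 lq = torch.rand(1, 3, 64, 64, device=device)                  # low-quality input patch
 gt = torch.rand(1, 3, 64, 64, device=device)                  # ground-truth patch
 
 optimizer.zero_grad()
 with torch.cuda.amp.autocast(enabled=(device == "cuda")):     # forward pass runs in mixed precision
     loss = criterion(model(lq), gt)
 scaler.scale(loss).backward()                                 # scale the loss to avoid fp16 gradient underflow
 scaler.step(optimizer)
 scaler.update()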

traiNNer-redux FJ

A fork of Joey's fork that adds further architectures, ...

traiNNer-redux

A fork of BasicSR made by Joey that adds some extra losses, such as CX (contextual) loss and color loss, a few additional architectures, such as swinirgan, 2c2esrgan, RealSwinIRGAN, compact-2c2 and OmniSR, and a few augmentations.

Xinntao's Original BasicSR

This is the original BasicSR repository. It was officially updated to v1; however, it is not recommended, as it lacks many of the features the community generally finds useful and only creates x4 scale new-arch models.

Colab-traiNNer

Sudo's fork, which contains a ton of architectures and losses. You can see him experimenting with and expanding it on our Discord.

Dead BasicSR Forks

Victorca25's traiNNer

This used to be the recommended fork. It includes many features, such as contextual loss, enhanced degradation pipelines (augmentations), and support for many architectures (SOFVSR, for example).

BlueAmulet's Fork

This fork added many features like auto-backups, YAML training configs, and Frequency Separation.

It is no longer maintained.

DinJerr's Fork

This fork has lots of cool on-the-fly training options added through ImageMagick, including different types of dithering and a Kuwahara filter (see the sketch below).

It seems to be unmaintained as of late 2021.
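
As a rough illustration of what ImageMagick-based on-the-fly preprocessing can look like (this is not DinJerr's actual pipeline; the file names are placeholders, and -kuwahara requires ImageMagick 7):

 # Sketch of applying ImageMagick operations to training tiles via Python.
 # Not DinJerr's actual pipeline; file names are placeholders.
 # "-kuwahara" requires ImageMagick 7; "-ordered-dither" also exists in ImageMagick 6.
 import subprocess
 
 def kuwahara(src: str, dst: str, radius: int = 2) -> None:
     # Edge-preserving smoothing filter
     subprocess.run(["magick", src, "-kuwahara", str(radius), dst], check=True)
 
 def ordered_dither(src: str, dst: str, pattern: str = "o8x8,8") -> None:
     # Ordered dithering with an 8x8 threshold map at 8 levels
     subprocess.run(["magick", src, "-ordered-dither", pattern, dst], check=True)
 
 kuwahara("lq_tile.png", "lq_tile_kuwahara.png")
 ordered_dither("lq_tile.png", "lq_tile_dithered.png")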