We have moved to https://openmodeldb.info/. This new site has a tag and search system, which will make finding the right models for you much easier! If you have any questions, ask here: https://discord.gg/cpAUpDK

Maintained BasicSR Forks

From Upscale Wiki
 
== Currently Maintained BasicSR Forks ==
  
=== [https://github.com/muslll/neosr NeoSR] ===
The latest fork, by musl. It supports AMP training and takes an opinionated approach to losses and architectures, omitting those that do not add a meaningful benefit to the training process. It supports all of the relevant modern architectures and is therefore the recommended fork as of September 2023.
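AMP here is PyTorch's Automatic Mixed Precision: most forward/backward ops run in float16 while a float32 master copy of the weights is kept, and the loss is scaled so float16 gradients do not underflow. A minimal sketch of one AMP training step follows; the model and data are toy stand-ins, not NeoSR's actual training loop.

```python
import torch
from torch import nn

device = "cuda" if torch.cuda.is_available() else "cpu"
use_amp = device == "cuda"  # fp16 autocast is a CUDA feature; fall back to fp32 on CPU

# Toy stand-ins for a super-resolution model and a training batch.
model = nn.Sequential(
    nn.Conv2d(3, 8, 3, padding=1), nn.ReLU(), nn.Conv2d(8, 3, 3, padding=1)
).to(device)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
scaler = torch.cuda.amp.GradScaler(enabled=use_amp)

lr_batch = torch.rand(4, 3, 32, 32, device=device)  # "low-res" inputs
hr_batch = torch.rand(4, 3, 32, 32, device=device)  # "high-res" targets

for _ in range(2):  # two illustrative training steps
    optimizer.zero_grad(set_to_none=True)
    with torch.autocast(device_type=device, enabled=use_amp):
        sr = model(lr_batch)                       # forward pass runs in fp16 under AMP
        loss = nn.functional.l1_loss(sr, hr_batch)
    scaler.scale(loss).backward()  # scale loss so fp16 gradients don't underflow
    scaler.step(optimizer)         # unscales gradients; skips the step on inf/nan
    scaler.update()
```

With `use_amp` false (no GPU), `GradScaler` and `autocast` become no-ops and the same loop trains in full precision, which is why the scaler pattern is safe to leave in place unconditionally.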

=== [https://github.com/FlotingDream/traiNNer-redux-FJ traiNNer-redux FJ] ===
A fork of Joey's fork, adding additional architectures, ...

=== [https://github.com/joeyballentine/traiNNer-redux traiNNer-redux] ===
A fork of BasicSR made by Joey that adds some additional losses, such as CX (contextual) loss and color loss, a few additional architectures (swinirgan, 2c2esrgan, RealSwinIRGAN, compact-2c2, OmniSR), and a few augmentations.
  
=== [https://github.com/xinntao/BasicSR Xinntao's Original BasicSR] ===
This is the original BasicSR repository. It was officially updated to v1; however, it is not recommended, as it lacks many of the features the community generally finds useful and only creates x4-scale [[ESRGAN new-arch|new-arch models]].
  
=== [https://github.com/styler00dollar/Colab-traiNNer Colab-traiNNer] ===

Sudo's fork, containing a ton of architectures and losses; you can see him experimenting with and expanding it on our Discord.
  
 
== Dead BasicSR Forks ==
  
=== [https://github.com/victorca25/BasicSR Victorca25's traiNNer] ===
This used to be the recommended fork; it blessed us with [[On The Fly training]] and many other features, including contextual loss, enhanced degradation pipelines (augmentations), and support for many architectures (SOFVSR, for example). However, it is buggy and no longer maintained, so it is not recommended any longer. Both BlueAmulet's and DinJerr's forks are technically forks of this fork.

=== [https://github.com/BlueAmulet/BasicSR BlueAmulet's Fork] ===
This fork added many features like auto-backups, YAML training configs, and Frequency Separation.
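Frequency Separation here refers to splitting an image into a low-frequency (blurred) component and a high-frequency residual, so that losses can be applied to each band separately (e.g. a GAN loss only on detail). A minimal illustrative sketch, not BlueAmulet's actual implementation; the iterated neighbourhood average is a stand-in for the Gaussian blur a real pipeline would use:

```python
import numpy as np

def frequency_separate(img: np.ndarray, iterations: int = 3):
    """Split an image into low- and high-frequency components.

    The low-pass is a simple iterated 5-point neighbourhood average
    (illustrative only; a real pipeline would use a Gaussian blur).
    """
    low = img.astype(np.float64)
    for _ in range(iterations):
        low = (low
               + np.roll(low, 1, axis=0) + np.roll(low, -1, axis=0)
               + np.roll(low, 1, axis=1) + np.roll(low, -1, axis=1)) / 5.0
    high = img - low  # residual detail: edges, texture, noise
    return low, high

rng = np.random.default_rng(0)
img = rng.random((16, 16))
low, high = frequency_separate(img)
```

By construction the two bands sum back to the original image, so any per-band processing can be recombined losslessly with a simple addition.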
It is no longer maintained.

=== [https://github.com/DinJerr/BasicSR DinJerr's Fork] ===
This fork has lots of cool [[On The Fly training]] options added through ImageMagick. This includes different types of dithering and a Kuwahara filter.  
It seems to be unmaintained as of late 2021.

Latest revision as of 08:35, 17 September 2023
