Stop writing boilerplate. Start training.
Friendly Environment for Neural Networks (fenn) is a simple framework that automates ML/DL workflows by providing prebuilt trainers, templates, logging, configuration management, and much more. With fenn, you can focus on your model and data while it takes care of the rest.
If fenn is useful for your work or research, consider supporting its development.
You can support the project by starring the repository on GitHub. It improves visibility and helps others discover fenn.
Sponsorship also helps fund maintenance, improvements, and new features.
Support the project: https://github.com/sponsors/blkdmr
- Auto-Configuration: YAML files are automatically parsed and injected into your entrypoint, with CLI override support. No more hardcoded hyperparameters or scattered config logic.
- Unified Logging: All logs, print statements, and experiment metadata are automatically captured to local files and remote tracking backends simultaneously, with no manual setup required.
- Backend Monitoring: Native integration with industry-standard trackers such as Weights & Biases (W&B) for centralized experiment tracking and TensorBoard for real-time metric visualization.
- Instant Notifications: Get real-time alerts on Discord and Telegram when experiments start, complete, or fail, with no polling or manual checks.
- Trainers: Built-in support for training loops, validation, and testing with minimal boilerplate. Just define your model and data, and let fenn handle the rest.
- Template Ready: Built-in support for reproducible, shareable experiment templates.
```shell
pip install fenn
```

Run the CLI tool to see which repositories are available and to download a template together with its configuration file. First, list the available repositories:

```shell
fenn list
```

Then, download one of the available templates (here `empty` is just an example):

```shell
fenn pull empty
```

This command downloads the selected template into the current directory and generates the corresponding configuration file, which can be customized before running or extending the project.
fenn relies on a simple YAML structure to define hyperparameters, paths, logging options, and integrations. You can configure the fenn.yaml file with the hyperparameters and options for your project.
The structure of the fenn.yaml file is:
```yaml
# ---------------------------------------
# Fenn Configuration (Modify Carefully)
# ---------------------------------------
project: empty

# ---------------------------
# Logging & Tracking
# ---------------------------
logger:
  dir: logger

# ---------------------------------------
# Example of User Section
# ---------------------------------------
train:
  lr: 0.001
```

Use the `@app.entrypoint` decorator. Your configuration variables are automatically passed in via `args`.
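Given the example file above, `args` arrives as an ordinary nested mapping. The following is an illustrative sketch assuming standard YAML parsing semantics; the exact container type fenn uses may differ:

```python
# Equivalent of the fenn.yaml above after parsing (standard YAML semantics).
# Illustrative only: fenn builds this mapping for you from the file.
args = {
    "project": "empty",
    "logger": {"dir": "logger"},
    "train": {"lr": 0.001},
}

# Nested lookup, matching how the entrypoint reads hyperparameters.
print(args["train"]["lr"])  # → 0.001
```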
```python
from fenn import Fenn

app = Fenn()

@app.entrypoint
def main(args):
    # 'args' contains your fenn.yaml configuration
    print(f"Training with learning rate: {args['train']['lr']}")
    # Your logic here...

if __name__ == "__main__":
    app.run()
```

By default, fenn looks for a configuration file named `fenn.yaml` in the current directory. To use a different name or location, or to keep multiple configuration files, call `set_config_file()` with the path to your configuration file. You must set the filename before calling `run()`.
```python
app = Fenn()
app.set_config_file("my_file.yaml")
...
app.run()
```

Then run your project:

```shell
python main.py
```

Contributions are welcome!
Interested in contributing? Join the community on Discord
We can then discuss a possible contribution together, answer any questions, and help you get started!
Before opening a pull request, please consult our CONTRIBUTING.md.
The development and long-term direction of fenn are guided by the following maintainers:
| Maintainer | Role |
|---|---|
| @blkdmr | Creator & Project Administrator |
| @giuliaOddi | Project Administrator |
| @GlowCheese | Core Maintainer |
| @franciscolima05 | Core Maintainer |
Maintainers oversee the project roadmap, review pull requests, coordinate releases, and ensure the long-term stability and quality of the framework.
Thank you for supporting the project.

