{% extends "!layout.html" %} {% set title = "Welcome To Neural Network Intelligence !!!"%} {% block document %}
NNI (Neural Network Intelligence) is a lightweight but powerful toolkit to help users automate Feature Engineering, Neural Architecture Search, Hyperparameter Tuning and Model Compression.

The tool manages automated machine learning (AutoML) experiments, dispatching and running the trial jobs generated by tuning algorithms to search for the best neural architecture and/or hyper-parameters in different training environments such as Local Machine, Remote Servers, OpenPAI, Kubeflow, FrameworkController on K8S (AKS etc.), DLWorkspace (aka. DLTS), AML (Azure Machine Learning), and other cloud options.

Who should consider using NNI

NNI {{ release }} has been released!

NNI capabilities at a glance

NNI provides a command line tool as well as a user-friendly WebUI to manage training experiments. With the extensible API, you can customize your own AutoML algorithms and training services. To make it easy for new users, NNI also provides a set of built-in state-of-the-art AutoML algorithms and out-of-the-box support for popular training platforms.
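As a minimal sketch of that trial-facing API (assuming a search space containing a hyperparameter named "lr"; the training step is a hypothetical placeholder), a trial script asks NNI for the next set of hyperparameters and reports its result back:

import nni

def run_trial(params):
    # Hypothetical stand-in for real training code; returns a score for NNI to maximize.
    lr = params.get("lr", 0.01)
    return 1.0 - (lr - 0.01) ** 2

if __name__ == "__main__":
    params = nni.get_next_parameter() or {}   # hyperparameters chosen by the tuner for this trial
    score = run_trial(params)
    nni.report_final_result(score)            # final metric, collected by the experiment

nni.report_intermediate_result can additionally be called during training so that early-stop algorithms can act on partial results.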

The following summarizes the current NNI capabilities. We are gradually adding new capabilities, and we'd love to have your contributions.

Built-in

Frameworks & Libraries
  • Supported Frameworks
    • PyTorch
    • Keras
    • TensorFlow
    • MXNet
    • Caffe2
    • More...
  • Supported Libraries
    • Scikit-learn
    • XGBoost
    • LightGBM
    • More...

Algorithms
  • Hyperparameter Tuning
  • Neural Architecture Search
  • Model Compression
  • Feature Engineering (Beta)
  • Early Stop Algorithms

Training Services
  • Local Machine
  • Remote Servers
  • OpenPAI
  • Kubeflow
  • FrameworkController on K8S (AKS etc.)
  • DLWorkspace (aka. DLTS)
  • AML (Azure Machine Learning)
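As a concrete illustration of the hyperparameter tuning entry above, a search space maps each tunable parameter to a sampling distribution and is typically kept in a search_space.json file that the experiment configuration points to. This is a minimal sketch; the parameter names "lr", "batch_size", and "optimizer" are hypothetical:

import json

# Hedged example of an NNI search space; _type names such as "choice" and
# "loguniform" refer to built-in sampling strategies.
search_space = {
    "lr":         {"_type": "loguniform", "_value": [1e-4, 1e-1]},
    "batch_size": {"_type": "choice",     "_value": [16, 32, 64, 128]},
    "optimizer":  {"_type": "choice",     "_value": ["sgd", "adam"]},
}

with open("search_space.json", "w") as f:
    json.dump(search_space, f, indent=2)

The keys of this mapping are exactly what nni.get_next_parameter() hands to each trial.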

Installation

Install

NNI supports and is tested on Ubuntu >= 16.04, macOS >= 10.14.1, and Windows 10 >= 1809. Simply run the following `pip install` in an environment that has `python 64-bit >= 3.6`.

Linux or macOS
python3 -m pip install --upgrade nni
Windows
python -m pip install --upgrade nni

If you want to try the latest code, please install NNI from source.

For detailed system requirements of NNI, please refer here for Linux & macOS and here for Windows.


Verify installation

The following example is built on TensorFlow 1.x. Make sure TensorFlow 1.x is used when running it.
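Before running the TensorFlow 1.x example, a quick sanity check (a minimal sketch, assuming the package exposes a __version__ attribute) is to confirm that nni is importable from the environment you installed it into:

# If this import fails, the pip installation above did not succeed.
import nni
print(getattr(nni, "__version__", "version attribute not found"))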

Documentation

Contributing

This project welcomes contributions and suggestions. Most contributions require you to agree to a Contributor License Agreement (CLA) declaring that you have the right to, and actually do, grant us the rights to use your contribution. For details, visit https://cla.microsoft.com.

When you submit a pull request, a CLA-bot will automatically determine whether you need to provide a CLA and decorate the PR appropriately (e.g., label, comment). Simply follow the instructions provided by the bot. You will only need to do this once across all repos using our CLA.

This project has adopted the Microsoft Open Source Code of Conduct. For more information see the Code of Conduct FAQ or contact opencode@microsoft.com with any additional questions or comments.

After getting familiar with the contribution agreements, you are ready to create your first PR =). Follow the NNI developer tutorials to get started:

External Repositories and References

With the authors' permission, we have listed a set of NNI usage examples and relevant articles.

Feedback

Join IM discussion groups:
  • Gitter
  • NNI WeChat

Related Projects

Targeting openness and advancing state-of-the-art technology, Microsoft Research (MSR) has also released a few other open source projects.

We encourage researchers and students to leverage these projects to accelerate AI development and research.

License

The entire codebase is under the MIT license.

{% endblock %}