
    MegEngine

    English | 中文

    MegEngine is a fast, scalable and easy-to-use deep learning framework, with auto-differentiation.


    Installation

    NOTE: MegEngine supports Python installation on Linux-64bit/Windows-64bit/macOS(CPU-only)-10.14+ platforms with Python 3.5 to 3.8. On Windows 10 you can either install the Linux distribution through the Windows Subsystem for Linux (WSL) or install the Windows distribution directly. Many other platforms are supported for inference.
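Before installing, you can confirm that the local interpreter falls in the supported 3.5–3.8 range. This is a convenience sketch, not part of the official instructions; `supported_python` is a hypothetical helper.

```shell
#!/bin/sh
# Hypothetical helper (not part of the MegEngine repo): check whether a
# "major.minor" Python version string is in the supported 3.5-3.8 range.
supported_python() {
    major=${1%%.*}
    minor=${1#*.}; minor=${minor%%.*}
    if [ "$major" -eq 3 ] && [ "$minor" -ge 5 ] && [ "$minor" -le 8 ]; then
        echo yes
    else
        echo no
    fi
}

# Check the interpreter that pip will install against
supported_python "$(python3 -c 'import sys; print("%d.%d" % sys.version_info[:2])')"
```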

    Binaries

    To install the pre-built binaries via pip wheels:

    python3 -m pip install megengine -f https://megengine.org.cn/whl/mge.html
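After the wheel installs, a quick smoke test confirms the package is importable. The `check_import` helper below is hypothetical (not from the MegEngine docs), written to work for any Python package.

```shell
#!/bin/sh
# Hypothetical smoke test (not from the official docs): report whether a
# Python package can be imported after `pip install`.
check_import() {
    python3 -c "import $1" 2>/dev/null && echo ok || echo failed
}

check_import megengine   # prints "ok" once the wheel above is installed
```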

    Building from Source

    Prerequisites

    Most of the dependencies of MegEngine are located in third_party directory, which can be prepared by executing:

    ./third_party/prepare.sh
    ./third_party/install-mkl.sh

    But some dependencies need to be installed manually:

    • CUDA(>=10.1), cuDNN(>=7.6) are required when building MegEngine with CUDA support.
    • TensorRT(>=5.1.5) is required when building with TensorRT support.
    • LLVM/Clang(>=6.0) is required when building with Halide JIT support.
    • Python(>=3.5) and numpy are required to build Python modules.
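The manually-installed dependencies above can be sanity-checked before configuring the build. The `check_tool` helper is a hypothetical convenience, not an official MegEngine script; it only reports whether each tool is on `PATH`, not its version.

```shell
#!/bin/sh
# Hypothetical convenience check (not an official MegEngine script):
# report whether each manually-installed build tool is on PATH.
check_tool() {
    if command -v "$1" >/dev/null 2>&1; then
        echo found
    else
        echo missing
    fi
}

for tool in nvcc clang python3; do
    echo "$tool: $(check_tool "$tool")"
done
# numpy is a Python-level dependency, checked via import
python3 -c 'import numpy; print("numpy:", numpy.__version__)' 2>/dev/null \
    || echo "numpy: missing"
```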

    Build

    MegEngine uses CMake as the build tool. We provide the following scripts to facilitate building.

    • host_build.sh builds MegEngine that runs on the same host machine (i.e., no cross compiling). The following command displays the usage:
      scripts/cmake-build/host_build.sh -h
    • cross_build_android_arm_inference.sh builds MegEngine for DNN inference on Android-ARM platforms. The following command displays the usage:
      scripts/cmake-build/cross_build_android_arm_inference.sh -h
    • cross_build_linux_arm_inference.sh builds MegEngine for DNN inference on Linux-ARM platforms. The following command displays the usage:
      scripts/cmake-build/cross_build_linux_arm_inference.sh -h
    • cross_build_ios_arm_inference.sh builds MegEngine for DNN inference on iOS (iPhone/iPad) platforms. The following command displays the usage:
      scripts/cmake-build/cross_build_ios_arm_inference.sh -h
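The mapping from target platform to build script can be captured in a small dispatcher. `build_script_for` is a hypothetical helper (not part of the repo) that only selects among the four scripts listed above.

```shell
#!/bin/sh
# Hypothetical helper (not part of the MegEngine repo): map a target
# platform name to the corresponding cmake-build script listed above.
build_script_for() {
    case "$1" in
        host)        echo scripts/cmake-build/host_build.sh ;;
        android-arm) echo scripts/cmake-build/cross_build_android_arm_inference.sh ;;
        linux-arm)   echo scripts/cmake-build/cross_build_linux_arm_inference.sh ;;
        ios-arm)     echo scripts/cmake-build/cross_build_ios_arm_inference.sh ;;
        *)           echo "unknown target: $1" >&2; return 1 ;;
    esac
}

build_script_for android-arm
```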

    Please refer to BUILD_README.md for more details.

    How to Contribute

    We strive to build an open and friendly community. We aim to power humanity with AI.

    How to Contact Us

    Resources

    License

    MegEngine is licensed under the Apache License, Version 2.0.

    Copyright (c) 2014-2021 Megvii Inc. All rights reserved.
