xgboost Release 1.0.2
xgboost developers
Apr 15, 2020

CONTENTS

1 Contents
  1.1 Installation Guide
  1.2 Get Started with XGBoost
  1.3 XGBoost Tutorials
  1.4 Frequently Asked Questions
  1.5 XGBoost GPU Support
  1.6 XGBoost Parameters
  1.7 XGBoost Python Package
  1.8 XGBoost R Package
  1.9 XGBoost JVM Package
  1.10 XGBoost.jl
  1.11 XGBoost C Package
  1.12 XGBoost C++ API
  1.13 XGBoost Command Line version
  1.14 Contribute to XGBoost
Python Module Index
Index

XGBoost is an optimized distributed gradient boosting library designed to be highly efficient, flexible and portable. It implements machine learning algorithms under the Gradient Boosting framework. XGBoost provides a parallel tree boosting (also known as GBDT, GBM) that solves many data science problems in a fast and accurate way. The same code runs on major distributed environments (Hadoop, SGE, MPI) and can solve problems with billions of examples.

CHAPTER ONE

CONTENTS

1.1 Installation Guide

Note: Pre-built binary wheel for Python
If you are planning to use Python, consider installing XGBoost from a pre-built binary wheel, available from the Python Package Index (PyPI).
You may download and install it by running:

```shell
# Ensure that you are downloading one of the following:
#   * xgboost-{version}-py2.py3-none-manylinux1_x86_64.whl
#   * xgboost-{version}-py2.py3-none-win_amd64.whl
pip3 install xgboost
```

• The binary wheel will support GPU algorithms (gpu_hist) on machines with NVIDIA GPUs. Please note that training with multiple GPUs is only supported on Linux. See XGBoost GPU Support.
• Currently, we provide binary wheels for 64-bit Linux and Windows.
• Nightly builds are available. You can run pip install https://s3-us-west-2.amazonaws.com/xgboost-nightly-builds/xgboost-[version]+[commit hash]-py2.py3-none-manylinux1_x86_64.whl to install the nightly build with the given commit hash. See this page for the list of all nightly builds.

1.1.1 Building XGBoost from source

This page gives instructions on how to build and install XGBoost from scratch on various systems. It consists of two steps:

1. First build the shared library from the C++ code (libxgboost.so for Linux/OSX and xgboost.dll for Windows). (For R-package installation, please refer directly to R Package Installation.)
2. Then install the language packages (e.g. the Python Package).

Note: Use of Git submodules
XGBoost uses Git submodules to manage dependencies. So when you clone the repo, remember to specify the --recursive option:

```shell
git clone --recursive https://github.com/dmlc/xgboost
```

For Windows users who use GitHub tools, you can open the Git Shell and type the following commands:

```shell
git submodule init
git submodule update
```

Please refer to the Trouble Shooting section first if you have any problems during installation. If the instructions do not work for you, please feel free to ask questions at the user forum.
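Whichever install route you take, you can confirm that the Python package is visible to your interpreter by probing for it with the standard library. This is a small sketch; `xgboost_available` is a name of our own, not part of the XGBoost API:

```python
# Sanity check after installation: probe for the xgboost package
# without importing it.  xgboost_available() is a hypothetical helper,
# not part of the XGBoost API.
import importlib.util

def xgboost_available() -> bool:
    """Return True if the xgboost package can be found on sys.path."""
    return importlib.util.find_spec("xgboost") is not None

print("xgboost importable:", xgboost_available())
```

If this prints False after a pip install, the wheel was most likely installed into a different interpreter than the one you are running.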
Contents

• Building the Shared Library
  – Building on Ubuntu/Debian
  – Building on OSX
  – Building on Windows
  – Building with GPU support
  – Customized Building
• Python Package Installation
• R Package Installation
• Trouble Shooting
• Building the documentation

1.1.2 Building the Shared Library

Our goal is to build the shared library:

• On Linux/OSX the target library is libxgboost.so
• On Windows the target library is xgboost.dll

The minimal building requirements are:

• A recent C++ compiler supporting C++11 (g++-5.0 or higher)
• CMake 3.3 or higher (3.12 for building with CUDA)

For a list of CMake options, see #-- Options in CMakeLists.txt at the top of the source tree.

Building on Ubuntu/Debian

On Ubuntu, one builds XGBoost by running CMake:

```shell
git clone --recursive https://github.com/dmlc/xgboost
cd xgboost
mkdir build
cd build
cmake ..
make -j4
```

Building on OSX

Install with pip: simple method

First, obtain the OpenMP library (libomp) with Homebrew (https://brew.sh/) to enable multi-threading (i.e. using multiple CPU threads for training):

```shell
brew install libomp
```

Then install XGBoost with pip:

```shell
pip3 install xgboost
```

You might need to run the command with the --user flag if you run into permission errors.

Build from the source code: advanced method

Obtain libomp from Homebrew:

```shell
brew install libomp
```

Now clone the repository:

```shell
git clone --recursive https://github.com/dmlc/xgboost
```

Create the build/ directory and invoke CMake. After invoking CMake, you can build XGBoost with make:

```shell
mkdir build
cd build
cmake ..
make -j4
```

You may now continue to Python Package Installation.

Building on Windows

You need to first clone the XGBoost repo with the --recursive option in order to clone the submodules. We recommend you use Git for Windows, as it comes with a standard Bash shell. This will greatly ease the installation process.

```shell
git submodule init
git submodule update
```

XGBoost supports compilation with Microsoft Visual Studio and MinGW.
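The source builds above all assume that git, cmake, and a make tool can be found on your PATH. A small sketch to check this up front; the tool names are the common defaults and may differ on your platform (for example, mingw32-make instead of make under MinGW):

```python
# Check that the tools needed for a source build are discoverable on
# PATH.  The names below are common defaults; adjust for your platform
# (e.g. mingw32-make on Windows with MinGW).
import shutil

required = ["git", "cmake", "make"]
missing = [tool for tool in required if shutil.which(tool) is None]
print("missing build tools:", missing if missing else "none")
```

An empty "missing" list does not guarantee the versions meet the minimums listed above (e.g. CMake 3.3, or 3.12 for CUDA), only that the tools exist.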
Compile XGBoost with Microsoft Visual Studio

To build with Visual Studio, we will need CMake. Make sure to install a recent version of CMake. Then run the following from the root of the XGBoost directory:

```shell
mkdir build
cd build
cmake .. -G"Visual Studio 14 2015 Win64"
```

This specifies an out-of-source build using the Visual Studio 64-bit generator. (Change the -G option appropriately if you have a different version of Visual Studio installed.) Open the .sln file in the build directory and build with Visual Studio. After the build process successfully ends, you will find an xgboost.dll library file inside the ./lib/ folder.

Compile XGBoost using MinGW

After installing Git for Windows, you should have a shortcut named Git Bash. You should run all subsequent steps in Git Bash. In MinGW, the make command comes with the name mingw32-make. You can add the following line to the .bashrc file:

```shell
alias make='mingw32-make'
```

(On 64-bit Windows, you should get MinGW64 instead.) Make sure that the path to MinGW is in the system PATH. To build with MinGW, type:

```shell
cp make/mingw64.mk config.mk; make -j4
```

See Building XGBoost library for Python for Windows with MinGW-w64 (Advanced) for building XGBoost for Python.

Building with GPU support

XGBoost can be built with GPU support for both Linux and Windows using CMake. GPU support works with the Python package as well as the CLI version. See Installing R package with GPU support for special instructions for R. An up-to-date version of the CUDA toolkit is required. From the command line on Linux, starting from the XGBoost directory:

```shell
mkdir build
cd build
cmake .. -DUSE_CUDA=ON
make -j4
```

Note: Enabling distributed GPU training
By default, distributed GPU training is disabled and only a single GPU will be used. To enable distributed GPU training, set the option USE_NCCL=ON. Distributed GPU training depends on NCCL2, available at this link.
Since NCCL2 is only available for Linux machines, distributed GPU training is available only for Linux.

```shell
mkdir build
cd build
cmake .. -DUSE_CUDA=ON -DUSE_NCCL=ON -DNCCL_ROOT=/path/to/nccl2
make -j4
```

On Windows, run CMake as follows:

```shell
mkdir build
cd build
cmake .. -G"Visual Studio 14 2015 Win64" -DUSE_CUDA=ON
```

(Change the -G option appropriately if you have a different version of Visual Studio installed.)

Note: Visual Studio 2017 Win64 Generator may not work
Choosing the Visual Studio 2017 generator may cause compilation failure. When it happens, specify the 2015 compiler by adding the -T option:

```shell
cmake .. -G"Visual Studio 15 2017 Win64" -T v140,cuda=8.0 -DUSE_CUDA=ON
```

To speed up compilation, the compute version specific to your GPU could be passed to cmake as, e.g., -DGPU_COMPUTE_VER=50. The above cmake configuration run will create an xgboost.sln solution file in the build directory. Build this solution in Release mode as an x64 build, either from Visual Studio or from the command line:

```shell
cmake --build . --target xgboost --config Release
```

To speed up compilation, run multiple jobs in parallel by appending the option -- /MP.

Customized Building

We recommend the use of CMake for most use cases. See the full range of building options in CMakeLists.txt. Alternatively, you may use the Makefile. The Makefile uses a configuration file config.mk, which lets you modify several compilation flags:

• Whether to enable support for various distributed filesystems such as HDFS and Amazon S3
• Which compiler to use
• And some more

To customize, first copy make/config.mk to the project root and then modify the copy.

Python Package Installation

The Python package is located at python-package/. There are several ways to install the package:

1. Install system-wide, which requires root permission:

```shell
cd python-package; sudo python setup.py install
```

You will, however, need the Python distutils module for this to work.
It is often part of the core Python package, or it can be installed using your package manager; e.g. on Debian use:

```shell
sudo apt-get install python-setuptools
```

Note: Re-compiling XGBoost
If you recompiled XGBoost, then you need to reinstall it to make the new library take effect.

2. Only set the environment variable PYTHONPATH to tell Python where to find the library. For example, assume we cloned xgboost in the home directory ~. Then we can add the following line to ~/.bashrc. This option is recommended for developers who change the code frequently. The changes will be immediately reflected once you pull the code and rebuild the project (no need to call setup again).
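The PYTHONPATH mechanism this developer setup relies on can be demonstrated in isolation: any directory listed in PYTHONPATH is prepended to sys.path, so modules inside it become importable in a fresh interpreter. A minimal sketch, using a throwaway stand-in module rather than the real python-package/ directory:

```python
# Demonstrates the PYTHONPATH mechanism used by the developer install:
# a directory listed in PYTHONPATH is added to sys.path, so modules
# inside it become importable.  demo_pkg is a throwaway stand-in for
# something like ~/xgboost/python-package, not part of XGBoost.
import os
import subprocess
import sys
import tempfile

with tempfile.TemporaryDirectory() as pkg_dir:
    # Create a dummy module inside the directory we put on PYTHONPATH.
    with open(os.path.join(pkg_dir, "demo_pkg.py"), "w") as f:
        f.write("VERSION = 'dev'\n")
    env = dict(os.environ, PYTHONPATH=pkg_dir)
    # A fresh interpreter started with this environment can import it.
    result = subprocess.run(
        [sys.executable, "-c", "import demo_pkg; print(demo_pkg.VERSION)"],
        env=env, capture_output=True, text=True,
    )

print(result.stdout.strip())
```

This is why no reinstall step is needed for option 2: Python resolves the package from the checkout at import time, so a rebuild of the shared library is picked up on the next interpreter start.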