# Paddle Serving
An easy-to-use Machine Learning Model Inference Service Deployment Tool

[![Release](https://img.shields.io/badge/Release-0.0.3-yellowgreen)](Release)
[![Issues](https://img.shields.io/github/issues/PaddlePaddle/Serving)](Issues)
[![License](https://img.shields.io/github/license/PaddlePaddle/Serving)](LICENSE)

[中文](./doc/README_CN.md)

## Motivation
Paddle Serving helps deep learning developers deploy an online inference service without much effort. Currently, Paddle Serving supports deep learning models trained with [Paddle](https://github.com/PaddlePaddle/Paddle), although it is easy to integrate the inference engines of other deep learning frameworks.

## Key Features
- Integrates with the Paddle training pipeline seamlessly; most Paddle models can be deployed with a single command (see the example after this list).
- Industrial serving features, such as management of multiple models, online model loading, and online A/B testing.
- Distributed key-value indexing, which is especially useful for large-scale sparse features as model inputs.
- Highly concurrent and efficient communication, built on [Baidu-rpc](https://github.com/apache/incubator-brpc).
- Multiple programming languages supported on the client side, including Golang, C++, and Python.
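
As a quick illustration of the one-line deployment mentioned above, a saved Paddle model directory can be served with a single command. This is the same `paddle_serving_server.serve` command used in the Quick Start below; the model directory name here is only a placeholder:

```shell
# Serve a saved model directory on port 9292 with 10 worker threads
# (your_model_dir is an illustrative placeholder).
python -m paddle_serving_server.serve --model your_model_dir --thread 10 --port 9292
```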

## Quick Start

Paddle Serving provides a lightweight Python API for model inference that integrates seamlessly with the training process. Here is a Boston house pricing example to get started quickly.

### Installation

```shell
pip install paddle-serving-client
pip install paddle-serving-server
```

### Download models and start server
```shell
wget --no-check-certificate https://paddle-serving.bj.bcebos.com/uci_housing.tar.gz
tar -xzf uci_housing.tar.gz
python -m paddle_serving_server.serve --model uci_housing_model --thread 10 --port 9292
```
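
To verify that the server came up, one simple check (not specific to Paddle Serving) is to confirm that a process is listening on the chosen port:

```shell
# Check that the serving process is listening on port 9292
# (assumes the net-tools package providing netstat is installed).
netstat -nlp | grep 9292
```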

### Client Side Scripts

```python
from paddle_serving_client import Client
import paddle
import sys

# Initialize the client with the client-side config and connect to the server.
client = Client()
client.load_client_config(sys.argv[1])
client.connect(["127.0.0.1:9292"])

# Read the UCI housing test set one sample at a time.
test_reader = paddle.batch(paddle.reader.shuffle(
    paddle.dataset.uci_housing.test(), buf_size=500), batch_size=1)

for data in test_reader():
    # Send the features as "x", fetch the predicted "price",
    # and print the prediction next to the ground-truth label.
    fetch_map = client.predict(feed={"x": data[0][0]}, fetch=["price"])
    print("{} {}".format(fetch_map["price"][0], data[0][1][0]))
```
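
The script above reads the client-side configuration path from `sys.argv[1]`. Assuming it is saved as `test_client.py` and the downloaded archive contains a client configuration directory named `uci_housing_client` (the script name and directory name here are assumptions for illustration), it can be run like this:

```shell
# Pass the client configuration produced alongside the model as the first argument.
python test_client.py uci_housing_client/serving_client_conf.prototxt
```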



### Documentation

[Design Doc (Chinese)](doc/DESIGN.md)

[How to configure Serving native operators on the server side?](doc/SERVER_OP.md)

[How to develop a new Serving operator](doc/OPERATOR.md)

[Client API for other programming languages](doc/CLIENT_API.md)

[FAQ (Chinese)](doc/FAQ.md)

### Advanced features and development

[Compile from source code (Chinese)](doc/COMPILE.md)

## Contribution

If you want to contribute code to Paddle Serving, please refer to the [Contribution Guidelines](doc/CONTRIBUTE.md).