Commit 403cb700 authored by: M mindspore-ci-bot, committed by: Gitee

!5565 update RELEASE

Merge pull request !5565 from mengchunyang/r0.7
......@@ -70,7 +70,7 @@
* Add optimized convolution1X1/3X3/depthwise/convolution_transposed kernels for OpenCL.
* Tool & example
* Add benchmark and TimeProfile tools.
* Add image classification Android Demo.
## Bugfixes
* Models
......
......@@ -6,7 +6,7 @@ MindSpore Lite is a high-performance, lightweight open source inference framewor
<img src="../../docs/MindSpore-Lite-architecture.png" alt="MindSpore Lite Architecture" width="600"/>
For more details please check out our [MindSpore Lite Architecture Guide](https://www.mindspore.cn/lite/docs/en/r0.7/architecture.html).
### MindSpore Lite features
......@@ -35,13 +35,13 @@ For more details please check out our [MindSpore Lite Architecture Guide](https:
The MindSpore team provides a series of pre-trained models for image classification and object detection. You can use these pre-trained models in your application.
The pre-trained model provided by MindSpore: [Image Classification](https://download.mindspore.cn/model_zoo/official/lite/). More models will be provided in the future.
MindSpore allows you to retrain pre-trained models to perform other tasks. For example, a pre-trained image classification model can be retrained to recognize new image types.
2. Model conversion and optimization
If you use a MindSpore or third-party model, you need to use the [MindSpore Lite Model Converter Tool](https://www.mindspore.cn/lite/tutorial/en/r0.7/use/converter_tool.html) to convert it into the MindSpore Lite format. The converter supports TensorFlow Lite, Caffe, and ONNX models; operator fusion and quantization can be applied during conversion.
MindSpore also provides a code-generation tool that converts models to be run on IoT devices into C code.
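As a sketch, a command-line conversion of a TensorFlow Lite model might look like the following. The model file name is a placeholder, and the exact flag set should be checked against the converter tool documentation for your release:

```shell
# Sketch: convert a TensorFlow Lite model to the MindSpore Lite format.
# converter_lite ships with the MindSpore Lite release package;
# mobilenet_v2.tflite is a placeholder model name.
./converter_lite \
    --fmk=TFLITE \
    --modelFile=mobilenet_v2.tflite \
    --outputFile=mobilenet_v2
# On success this produces mobilenet_v2.ms, loadable by the Lite runtime.
```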
......@@ -51,17 +51,17 @@ For more details please check out our [MindSpore Lite Architecture Guide](https:
4. Inference
Load the model and perform inference. [Inference](https://www.mindspore.cn/lite/tutorial/en/r0.7/use/runtime.html) is the process of running input data through the model to get output.
MindSpore provides pre-trained models that can be deployed on mobile devices; see the [example](https://www.mindspore.cn/lite/examples/en).
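In the r0.7-era C++ runtime, loading a converted `.ms` model and running inference follows roughly this shape. This is a minimal sketch, not a complete program: the model buffer is assumed to be loaded already, and error handling is elided.

```cpp
// Minimal sketch of the MindSpore Lite r0.7-era C++ inference flow.
// Assumes the Lite headers and libraries from the release package are
// available; buffer loading and error handling are elided placeholders.
#include "include/context.h"
#include "include/lite_session.h"
#include "include/model.h"

int RunInference(const char *model_buf, size_t model_size) {
  // Parse the flatbuffer model produced by converter_lite.
  auto *model = mindspore::lite::Model::Import(model_buf, model_size);

  // Create a session with a default (CPU) context and compile the graph.
  mindspore::lite::Context context;
  auto *session = mindspore::session::LiteSession::CreateSession(&context);
  session->CompileGraph(model);

  // Fill the input tensors, then run the graph.
  auto inputs = session->GetInputs();
  // ... copy input data into inputs[0]->MutableData() ...
  session->RunGraph();

  // Fetch the output tensors and read the results.
  auto outputs = session->GetOutputs();
  (void)outputs;
  delete session;
  return 0;
}
```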
## MindSpore Lite benchmark test results
Based on MindSpore r0.7, we tested a couple of networks on a HUAWEI Mate30 (Hisilicon Kirin990) mobile phone; the results below are for your reference.
| Network             | Thread Number | Average Run Time (ms) |
| ------------------- | ------------- | --------------------- |
| basic_squeezenet    | 4             | 9.10                  |
| inception_v3        | 4             | 69.361                |
| mobilenet_v1_10_224 | 4             | 7.137                 |
| mobilenet_v2_10_224 | 4             | 5.569                 |
| resnet_v2_50        | 4             | 48.691                |
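Timings like the above are typically gathered with the benchmark tool mentioned in the release notes. A typical invocation might look like this; the model path is a placeholder and the flag names are assumptions that should be verified against the benchmark tool documentation for your release:

```shell
# Sketch: time a converted model with the Lite benchmark tool.
# Flag names are assumptions; verify against your release's docs.
./benchmark \
    --modelPath=mobilenet_v2_10_224.ms \
    --numThreads=4 \
    --loopCount=100
```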
......@@ -9,7 +9,7 @@ MindSpore Lite is MindSpore's device-cloud synergy, lightweight, high-performance AI inference framework
<img src="../../docs/MindSpore-Lite-architecture.png" alt="MindSpore Lite Architecture" width="600"/>
For more details, please check out our [MindSpore Lite Architecture Guide](https://www.mindspore.cn/lite/docs/zh-CN/r0.7/architecture.html).
## MindSpore Lite Features
......@@ -43,13 +43,13 @@ MindSpore Lite is MindSpore's device-cloud synergy, lightweight, high-performance AI inference framework
The MindSpore team provides a series of pre-trained models for learning problems such as image classification and object detection. You can use the on-device models corresponding to these pre-trained models in your application.
Pre-trained model provided by MindSpore: [Image Classification](https://download.mindspore.cn/model_zoo/official/lite/). The MindSpore team will add more preset models later.
MindSpore allows you to retrain pre-trained models to perform other tasks. For example, a pre-trained image classification model can be retrained to recognize new image types.
2. Model conversion/optimization
If you use a model trained with MindSpore or a third-party framework, you need to use the [MindSpore Lite Model Converter Tool](https://www.mindspore.cn/lite/tutorial/zh-CN/r0.7/use/converter_tool.html) to convert it into the MindSpore Lite model format. Besides converting TensorFlow Lite, Caffe, and ONNX model formats into the MindSpore Lite format, the tool also provides operator fusion, quantization, and other functions.
MindSpore also provides a code-generation tool that converts models to be run on IoT devices into C code.
......@@ -61,17 +61,17 @@ MindSpore Lite is MindSpore's device-cloud synergy, lightweight, high-performance AI inference framework
4. Model inference
This step performs the model inference work: it loads the model and completes all model-related computation. [Inference](https://www.mindspore.cn/lite/tutorial/zh-CN/r0.7/use/runtime.html) is the process of running input data through the model to obtain predictions.
MindSpore provides [examples](https://www.mindspore.cn/lite/examples) of deploying pre-trained models on smart devices.
## MindSpore Lite performance reference data
Based on MindSpore r0.7, we tested the performance of a set of common on-device networks on a HUAWEI Mate30 (Hisilicon Kirin990) phone for your reference:
| Network             | Threads | Average Inference Time (ms) |
| ------------------- | ------- | --------------------------- |
| basic_squeezenet    | 4       | 9.10                        |
| inception_v3        | 4       | 69.361                      |
| mobilenet_v1_10_224 | 4       | 7.137                       |
| mobilenet_v2_10_224 | 4       | 5.569                       |
| resnet_v2_50        | 4       | 48.691                      |