Unverified commit 669c29d1, authored by Zhi Li, committed by GitHub

Feature/rename integer models (#722)

* Globally replace vmaf_v0.6.1.pkl with vmaf_float_v0.6.1.pkl.

* Globally replace vmaf_v0.6.1.json with vmaf_float_v0.6.1.json.

* Fix: replace remaining vmaf_v0.6.1.pkl with vmaf_float_v0.6.1.pkl.

* Rename vmaf_v0.6.1neg models to vmaf_float_v0.6.1neg.

* Rename vmaf_b_v0.6.3 to vmaf_float_b_v0.6.3.

* Rename vmaf_int_xxx models to vmaf_xxx models.
Parent d59a7eb0
@@ -76,7 +76,7 @@ jobs:
- name: Test ffmpeg
run: |
curl "https://gist.githubusercontent.com/1480c1/0c4575da638ef6e8203feffd0597de16/raw/akiyo_cif.tar.xz.base64" | base64 -d | xz -d | 7z x -si -ttar
-          vmaf_score=$(ffmpeg -hide_banner -nostdin -nostats -i encoded.mkv -i orig.mkv -filter_complex libvmaf=model_path=model/vmaf_v0.6.1.pkl -f null - 2>&1 | tee temp.txt | grep 'VMAF score' | tr ' ' '\n' | tail -n1)
+          vmaf_score=$(ffmpeg -hide_banner -nostdin -nostats -i encoded.mkv -i orig.mkv -filter_complex libvmaf=model_path=model/vmaf_float_v0.6.1.pkl -f null - 2>&1 | tee temp.txt | grep 'VMAF score' | tr ' ' '\n' | tail -n1)
cat temp.txt
echo "$vmaf_score"
if [[ $vmaf_score != "93.656203" ]]; then
......
@@ -13,7 +13,7 @@
- Restructure python project and documentation (#544).
- Move test resource to Netflix/vmaf_resource repo (#552).
- Add Github CI (#558).
- - Add vmaf_v0.6.1neg model; add vif_enhn_gain_limit and adm_enhn_gain_limit options to vmaf_rc.
+ - Add vmaf_float_v0.6.1neg model; add vif_enhn_gain_limit and adm_enhn_gain_limit options to vmaf_rc.
- Update documentation for FFmpeg+libvmaf.
- Improvements to AucPerfMetric (#643).
- Add motion_force_zero option to vmaf_rc.
......
@@ -6,7 +6,7 @@ A: It is associated with the underlying assumption of VMAF on the subject viewin
Fundamentally, any perceptual quality model should take into account the viewing distance and the display size (or the ratio between the two). The same distorted video, if viewed close up, could contain more visible artifacts and hence yield lower perceptual quality.
- In the case of the default VMAF model (`model/vmaf_v0.6.1.pkl`), which is trained to predict the quality of videos displayed on a 1080p HDTV in a living-room-like environment, all the subjective data were collected in such a way that the distorted videos get rescaled to 1080 resolution and displayed with a viewing distance of three times the screen height (3H), or an angular resolution of 60 pixels/degree. Effectively, what the VMAF model is trying to capture is the perceptual quality of a 1080 video displayed from 3H away. That’s the implicit assumption of the default VMAF model.
+ In the case of the default VMAF model (`model/vmaf_float_v0.6.1.pkl`), which is trained to predict the quality of videos displayed on a 1080p HDTV in a living-room-like environment, all the subjective data were collected in such a way that the distorted videos get rescaled to 1080 resolution and displayed with a viewing distance of three times the screen height (3H), or an angular resolution of 60 pixels/degree. Effectively, what the VMAF model is trying to capture is the perceptual quality of a 1080 video displayed from 3H away. That’s the implicit assumption of the default VMAF model.
Now, think about what it means when the VMAF score is calculated on a reference/distorted video pair of 480 resolution. It is as if the 480 video were CROPPED from a 1080 video. If the height of the 480 video is H’, then H’ = 480 / 1080 * H = 0.44 * H, where H is the height of the displayed 1080 video. As a result, VMAF is modeling a viewing distance of 3*H = 6.75*H’. In other words, if you calculate VMAF on the 480 resolution video pair, you are predicting the perceptual quality at a viewing distance of 6.75 times the video’s height. This is going to hide a lot of artifacts, hence yielding a very high score.
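The arithmetic in this answer can be sanity-checked with a one-liner (a minimal sketch; the 480/1080 heights and the 3H viewing distance are the values stated above):

```sh
# H' = 480/1080 * H; the 3H viewing distance re-expressed in units of H'
awk 'BEGIN {
    h_prime = 480 / 1080                               # height of the 480 video relative to H
    printf "H-prime = %.2f * H\n", h_prime
    printf "viewing distance = 3 * H = %.2f * H-prime\n", 3 / h_prime
}'
```

Since 3 * 1080 / 480 = 6.75 exactly, this reproduces the 0.44 and 6.75 factors quoted above.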
@@ -30,7 +30,7 @@ To provide some flexibility, in command-line tools *run_vmaf*, *run_psnr*, *run_
#### Q: Will VMAF work on 4K videos?
- A: The default VMAF model at `model/vmaf_v0.6.1.pkl` was trained on videos encoded at resolutions *up to* 1080p. It is still useful for measuring 4K videos if you are interested in a relative score. In other words, for two 4K videos A and B with A perceptually better than B, the VMAF scores will tell you so too. However, if you are interested in an absolute score, say, whether a 4K video is perceptually acceptable, you may not get an accurate answer.
+ A: The default VMAF model at `model/vmaf_float_v0.6.1.pkl` was trained on videos encoded at resolutions *up to* 1080p. It is still useful for measuring 4K videos if you are interested in a relative score. In other words, for two 4K videos A and B with A perceptually better than B, the VMAF scores will tell you so too. However, if you are interested in an absolute score, say, whether a 4K video is perceptually acceptable, you may not get an accurate answer.
As of VDK v1.3.7 (June 2018), we have added a new 4K model at `model/vmaf_4k_v0.6.1.pkl`, which is trained to predict 4KTV viewing at a distance of 1.5X the display height. Refer to [this](resource/doc/models.md/#predict-quality-on-a-4ktv-screen-at-15h) section for details.
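Scoring a 4K pair against this model can be sketched by adapting the libvmaf example elsewhere on this page: same flags, but with the 4K model path and 4K dimensions. This assumes the `vmaf` command-line tool built from libvmaf; the input file names and 3840x2160 geometry are hypothetical placeholders, not files shipped with the repository:

```sh
# Hypothetical 4K inputs; only the --model path and the dimensions differ
# from the documented 576x324 example.
vmaf \
    --reference ref_3840x2160.yuv \
    --distorted dis_3840x2160.yuv \
    --width 3840 --height 2160 --pixel_format 420 --bitdepth 8 \
    --model path=../model/vmaf_4k_v0.6.1.pkl \
    --output /dev/stdout
```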
......
@@ -61,7 +61,7 @@ ninja -vC build doc/html
## Example
- The following example shows a comparison using a pair of yuv inputs (`src01_hrc00_576x324.yuv`, `src01_hrc01_576x324.yuv`). In addition to VMAF, which is enabled with the model `../model/vmaf_v0.6.1.pkl`, the `psnr` metric is also computed and logged.
+ The following example shows a comparison using a pair of yuv inputs (`src01_hrc00_576x324.yuv`, `src01_hrc01_576x324.yuv`). In addition to VMAF, which is enabled with the model `../model/vmaf_float_v0.6.1.pkl`, the `psnr` metric is also computed and logged.
```sh
wget https://github.com/Netflix/vmaf_resource/raw/master/python/test/resource/yuv/src01_hrc00_576x324.yuv
@@ -70,7 +70,7 @@ wget https://github.com/Netflix/vmaf_resource/raw/master/python/test/resource/yu
--reference src01_hrc00_576x324.yuv \
--distorted src01_hrc01_576x324.yuv \
--width 576 --height 324 --pixel_format 420 --bitdepth 8 \
-    --model path=../model/vmaf_v0.6.1.pkl \
+    --model path=../model/vmaf_float_v0.6.1.pkl \
--feature psnr \
--output /dev/stdout
```
......
@@ -65,14 +65,14 @@ static char *test_json_model()
VmafModel *model_json;
VmafModelConfig cfg_json = { 0 };
-    const char *path_json = "../../model/vmaf_v0.6.1neg.json";
+    const char *path_json = "../../model/vmaf_float_v0.6.1neg.json";
err = vmaf_read_json_model_from_path(&model_json, &cfg_json, path_json);
mu_assert("problem during vmaf_read_json_model", !err);
VmafModel *model_pkl;
VmafModelConfig cfg_pkl = { 0 };
-    const char *path_pkl = "../../model/vmaf_v0.6.1neg.pkl";
+    const char *path_pkl = "../../model/vmaf_float_v0.6.1neg.pkl";
err = vmaf_model_load_from_path(&model_pkl, &cfg_pkl, path_pkl);
mu_assert("problem during vmaf_model_load_from_path", !err);
@@ -101,7 +101,7 @@ static char *test_model_collection()
mu_assert("problem during load_model_collection", !err);
// json
-    const char *json_path = "../../model/vmaf_b_v0.6.3.json";
+    const char *json_path = "../../model/vmaf_float_b_v0.6.3.json";
VmafModel *json_model;
VmafModelCollection *json_model_collection = NULL;
const VmafModelConfig json_cfg = { 0 };
@@ -124,7 +124,7 @@ static char *test_model_load_and_destroy()
VmafModel *model;
VmafModelConfig cfg = { 0 };
-    const char *path = "../../model/vmaf_v0.6.1.pkl";
+    const char *path = "../../model/vmaf_float_v0.6.1.pkl";
err = vmaf_model_load_from_path(&model, &cfg, path);
mu_assert("problem during vmaf_model_load_from_path", !err);
@@ -151,7 +151,7 @@ static char *test_model_check_default_behavior_unset_flags()
VmafModelConfig cfg = {
.name = "some_vmaf",
};
-    const char *path = "../../model/vmaf_v0.6.1.pkl";
+    const char *path = "../../model/vmaf_float_v0.6.1.pkl";
err = vmaf_model_load_from_path(&model, &cfg, path);
mu_assert("problem during vmaf_model_load_from_path", !err);
mu_assert("Model name is inconsistent.\n", !strcmp(model->name, "some_vmaf"));
@@ -174,7 +174,7 @@ static char *test_model_check_default_behavior_set_flags()
.name = "some_vmaf",
.flags = VMAF_MODEL_FLAGS_DEFAULT,
};
-    const char *path = "../../model/vmaf_v0.6.1.pkl";
+    const char *path = "../../model/vmaf_float_v0.6.1.pkl";
err = vmaf_model_load_from_path(&model, &cfg, path);
mu_assert("problem during vmaf_model_load_from_path", !err);
mu_assert("Model name is inconsistent.\n", !strcmp(model->name, "some_vmaf"));
@@ -196,7 +196,7 @@ static char *test_model_set_flags()
VmafModelConfig cfg1 = {
.flags = VMAF_MODEL_FLAG_ENABLE_TRANSFORM,
};
-    const char *path1 = "../../model/vmaf_v0.6.1.pkl";
+    const char *path1 = "../../model/vmaf_float_v0.6.1.pkl";
err = vmaf_model_load_from_path(&model1, &cfg1, path1);
mu_assert("problem during vmaf_model_load_from_path", !err);
mu_assert("Score transform must be enabled.\n",
@@ -209,7 +209,7 @@ static char *test_model_set_flags()
VmafModelConfig cfg2 = {
.flags = VMAF_MODEL_FLAG_DISABLE_CLIP,
};
-    const char *path2 = "../../model/vmaf_v0.6.1.pkl";
+    const char *path2 = "../../model/vmaf_float_v0.6.1.pkl";
err = vmaf_model_load_from_path(&model2, &cfg2, path2);
mu_assert("problem during vmaf_model_load_from_path", !err);
mu_assert("Score transform must be disabled.\n",
@@ -220,7 +220,7 @@ static char *test_model_set_flags()
VmafModel *model3;
VmafModelConfig cfg3 = { 0 };
-    const char *path3 = "../../model/vmaf_v0.6.1.pkl";
+    const char *path3 = "../../model/vmaf_float_v0.6.1.pkl";
err = vmaf_model_load_from_path(&model3, &cfg3, path3);
mu_assert("problem during vmaf_model_load_from_path", !err);
mu_assert("feature[0].opts_dict must be NULL.\n",
@@ -238,7 +238,7 @@ static char *test_model_set_flags()
VmafModel *model4;
VmafModelConfig cfg4 = { 0 };
-    const char *path4 = "../../model/vmaf_v0.6.1neg.pkl";
+    const char *path4 = "../../model/vmaf_float_v0.6.1neg.pkl";
err = vmaf_model_load_from_path(&model4, &cfg4, path4);
mu_assert("problem during vmaf_model_load_from_path", !err);
mu_assert("feature[0].opts_dict must not be NULL.\n",
......
@@ -36,7 +36,7 @@ static char *test_predict_score_at_index()
.name = "vmaf",
.flags = VMAF_MODEL_FLAGS_DEFAULT,
};
-    const char *path = "../../model/vmaf_v0.6.1.pkl";
+    const char *path = "../../model/vmaf_float_v0.6.1.pkl";
err = vmaf_model_load_from_path(&model, &cfg, path);
mu_assert("problem during vmaf_model_load_from_path", !err);
......
This diff has been collapsed.
@@ -29,12 +29,12 @@
100.0
],
"feature_names": [
-        "VMAF_integer_feature_adm2_score",
-        "VMAF_integer_feature_motion2_score",
-        "VMAF_integer_feature_vif_scale0_score",
-        "VMAF_integer_feature_vif_scale1_score",
-        "VMAF_integer_feature_vif_scale2_score",
-        "VMAF_integer_feature_vif_scale3_score"
+        "VMAF_feature_adm2_score",
+        "VMAF_feature_motion2_score",
+        "VMAF_feature_vif_scale0_score",
+        "VMAF_feature_vif_scale1_score",
+        "VMAF_feature_vif_scale2_score",
+        "VMAF_feature_vif_scale3_score"
],
"intercepts": [
-0.3092981927591963,
@@ -56,7 +56,7 @@
2.08656468286432
],
"feature_dict": {
-        "VMAF_integer_feature": [
+        "VMAF_feature": [
"vif_scale0",
"vif_scale1",
"vif_scale2",
......
@@ -55,12 +55,12 @@
100.0
],
"feature_names": [
-        "VMAF_integer_feature_adm2_score",
-        "VMAF_integer_feature_motion2_score",
-        "VMAF_integer_feature_vif_scale0_score",
-        "VMAF_integer_feature_vif_scale1_score",
-        "VMAF_integer_feature_vif_scale2_score",
-        "VMAF_integer_feature_vif_scale3_score"
+        "VMAF_feature_adm2_score",
+        "VMAF_feature_motion2_score",
+        "VMAF_feature_vif_scale0_score",
+        "VMAF_feature_vif_scale1_score",
+        "VMAF_feature_vif_scale2_score",
+        "VMAF_feature_vif_scale3_score"
],
"intercepts": [
-0.3092981927591963,
@@ -74,7 +74,7 @@
"model_type": "LIBSVMNUSVR",
"model": "svm_type nu_svr\nkernel_type rbf\ngamma 0.04\nnr_class 2\ntotal_sv 211\nrho -1.33133\nSV\n-4 1:0.65734273 2:0.34681232 3:0.093755557 4:0.60913934 5:0.69117362 6:0.73495824 \n4 1:0.8727433 2:0.49612229 3:0.59146724 4:0.78105663 5:0.84916292 6:0.8882561 \n4 1:0.89890005 2:0.49612229 3:0.66823667 4:0.86050887 5:0.90873162 6:0.93335071 \n4 1:0.20371751 2:0.49612229 3:0.10534315 4:-1.110223e-16 6:2.220446e-16 \n4 1:0.33913836 2:0.49612229 3:0.14024497 4:0.074708413 5:0.10231651 6:0.1259153 \n4 1:0.66426757 2:0.49612229 3:0.35268026 4:0.4805681 5:0.59603341 6:0.67408692 \n4 1:0.59561632 2:0.49612229 3:0.27561601 4:0.33977371 5:0.4325213 6:0.50244952 \n4 1:0.50821444 2:0.49612229 3:0.20276685 4:0.2004308 5:0.25758651 6:0.30054029 \n4 1:0.77877298 2:0.49612229 3:0.444392 4:0.61630491 5:0.71210086 6:0.77386496 \n4 1:0.71666017 2:0.49612229 3:0.35967401 4:0.47825205 5:0.57045236 6:0.63752441 \n4 1:0.64025669 2:0.49612229 3:0.27766156 4:0.33407105 5:0.40732401 6:0.46359154 \n4 1:0.88343983 2:0.23066177 3:0.65873851 4:0.86090402 5:0.90661213 6:0.93008753 \n4 1:0.90822691 2:0.23066177 3:0.71439481 4:0.90904598 5:0.94146542 6:0.95674338 \n-4 1:0.49037399 2:0.23066177 3:0.32329421 4:0.33686197 5:0.39456977 6:0.44944683 \n-4 1:0.69044383 2:0.23066177 3:0.43933868 4:0.56327049 5:0.65339511 6:0.71348696 \n-4 1:0.62390093 2:0.23066177 3:0.3800888 4:0.44927578 5:0.52327759 6:0.57907725 \n4 1:0.81887942 2:0.23066177 3:0.56208506 4:0.76164281 5:0.83176644 6:0.86914911 \n4 1:0.77189471 2:0.23066177 3:0.50145055 4:0.66525882 5:0.74327951 6:0.79017822 \n4 1:0.71405433 2:0.23066177 3:0.43952897 4:0.55736023 5:0.63319876 6:0.68402869 \n4 1:0.92114073 3:0.45198963 4:0.97703695 5:0.9907273 6:0.99510256 \n4 1:1 3:0.83319067 4:0.98956086 5:0.99577089 6:0.99784595 \n4 4:0.10344019 5:0.34323945 6:0.63855969 \n4 1:0.19531482 3:0.034330388 4:0.25480402 5:0.54197045 6:0.78020579 \n4 1:0.48394064 3:0.11866359 4:0.58816959 5:0.86435738 6:0.96191842 \n4 1:0.47628079 3:0.11185039 4:0.56180003 
5:0.83415721 6:0.93617329 \n4 1:0.46278632 3:0.10308547 4:0.52247575 5:0.78583924 6:0.89392193 \n4 1:0.7038079 3:0.2174879 4:0.84423613 5:0.9662906 6:0.98430594 \n4 1:0.69596686 3:0.20657211 4:0.81196884 5:0.94140702 6:0.96680805 \n4 1:0.68404358 3:0.19261438 4:0.76066415 5:0.89973293 6:0.93660362 \n4 1:0.84073022 2:0.34681232 3:0.22411304 4:0.88845644 5:0.94169671 6:0.96221395 \n-4 1:0.33900937 2:0.34681232 3:0.027607294 4:0.40659646 5:0.45456869 6:0.48256597 \n-4 1:0.44593129 2:0.34681232 3:0.041939301 4:0.45284872 5:0.5157613 6:0.55335821 \n-4 1:0.67301747 2:0.34681232 3:0.11526222 4:0.68549511 5:0.78556255 6:0.83507583 \n-4 1:0.62833533 2:0.34681232 3:0.092281981 4:0.61278125 5:0.70626575 6:0.75613977 \n-4 1:0.57196879 2:0.34681232 3:0.067548447 4:0.53383404 5:0.61287548 6:0.65468717 \n-0.3312466607741135 1:0.75125028 2:0.34681232 3:0.1457048 4:0.75791308 5:0.84155109 6:0.88132116 \n-4 1:0.71121936 2:0.34681232 3:0.12095689 4:0.68834617 5:0.77453583 6:0.81892861 \n-4 1:0.80269544 2:0.25207203 3:0.3681723 4:0.80658472 5:0.8702283 6:0.90583519 \n-4 1:0.86095387 2:0.25207203 3:0.52475418 4:0.85053413 5:0.90454501 6:0.93093678 \n-4 1:0.5008963 2:0.25207203 3:0.2005129 4:0.41516485 5:0.45282017 6:0.47396143 \n-4 1:0.56977992 2:0.25207203 3:0.21631076 4:0.45848604 5:0.51102137 6:0.53823055 \n-4 1:0.72779828 2:0.25207203 3:0.3051639 4:0.67537297 5:0.75767261 6:0.80327187 \n-4 1:0.68848569 2:0.25207203 3:0.27393051 4:0.60399854 5:0.68000038 6:0.72275152 \n-4 1:0.64121401 2:0.25207203 3:0.23994344 4:0.52538719 5:0.5891732 6:0.62164073 \n-4 1:0.76673633 2:0.25207203 3:0.33053889 4:0.73085549 5:0.80341439 6:0.84546456 \n-4 1:0.73041172 2:0.25207203 3:0.29691153 4:0.66166141 5:0.73408074 6:0.77757209 \n-4 1:0.68529047 2:0.25207203 3:0.26283557 4:0.58611788 5:0.65192525 6:0.69015011 \n4 1:0.86902267 2:0.48885268 3:0.5143645 4:0.8587242 5:0.91841685 6:0.94498293 \n4 1:0.89266106 2:0.48885268 3:0.55208861 4:0.89938377 5:0.94642982 6:0.96615102 \n-4 1:0.42554844 2:0.48885268 
3:0.2554221 4:0.36916892 5:0.43100226 6:0.50888404 \n-4 1:0.52520274 2:0.48885268 3:0.27824915 4:0.42915458 5:0.50850476 6:0.58585271 \n-4 1:0.69357445 2:0.48885268 3:0.35289928 4:0.61359907 5:0.7217863 6:0.78790011 \n-4 1:0.64679648 2:0.48885268 3:0.31268451 4:0.5167094 5:0.61224976 6:0.68477529 \n4 1:0.80595874 2:0.48885268 3:0.44075432 4:0.7803455 5:0.86328719 6:0.90222545 \n-4 1:0.7715192 2:0.48885268 3:0.4012577 4:0.70792536 5:0.80063653 6:0.85083872 \n4 1:0.82199966 2:0.20629643 3:0.30562098 4:0.80541317 5:0.89285836 6:0.92907353 \n-4 1:0.84774006 2:0.20629643 3:0.36755712 4:0.8681203 5:0.93297792 6:0.95700049 \n4 1:0.26631905 2:0.20629643 3:0.076468978 4:0.29833807 5:0.37989948 6:0.4576277 \n-4 1:0.65439648 2:0.20629643 3:0.19487894 4:0.63045155 5:0.76931142 6:0.83706632 \n4 1:0.55295603 2:0.20629643 3:0.13877412 4:0.4724047 5:0.59295828 6:0.66834832 \n4 1:0.75448924 2:0.20629643 3:0.24707248 4:0.72284103 5:0.83178838 6:0.88053503 \n4 1:0.83852041 2:0.15600331 3:0.1625414 4:0.81948421 5:0.90185357 6:0.9347395 \n4 1:0.85805266 2:0.15600331 3:0.19693206 4:0.86294641 5:0.92990351 6:0.95498998 \n-4 1:0.43384835 2:0.15600331 3:0.030541611 4:0.37279112 5:0.4588284 6:0.52004828 \n-4 1:0.72588966 2:0.48885268 3:0.35394597 4:0.61189191 5:0.70897304 6:0.77099691 \n-4 1:0.65865915 2:0.20629643 3:0.1796405 4:0.56432133 5:0.68049028 6:0.74616621 \n-4 1:0.53095193 2:0.15600331 3:0.046271684 4:0.4328793 5:0.5309142 6:0.59282089 \n-4 1:0.71891465 2:0.15600331 3:0.11085278 4:0.68794624 5:0.80350923 6:0.85660483 \n-4 1:0.68635753 2:0.15600331 3:0.091457045 4:0.60849701 5:0.72282659 6:0.78137183 \n-4 1:0.64162333 2:0.15600331 3:0.068820233 4:0.51732819 5:0.62198733 6:0.67977328 \n4 1:0.78395225 2:0.15600331 3:0.13401869 4:0.75274384 5:0.8506531 6:0.89321405 \n-4 1:0.75276337 2:0.15600331 3:0.11289462 4:0.67598462 5:0.78117168 6:0.83259364 \n-4 1:0.71345342 2:0.15600331 3:0.089218917 4:0.58797907 5:0.69284768 6:0.74971699 \n4 1:0.93500967 2:0.08765484 3:0.72226864 4:0.93291747 
5:0.960644 6:0.97304054 \n4 1:0.95150668 2:0.08765484 3:0.77391346 4:0.95596295 5:0.97544784 6:0.98405871 \n-4 1:0.48148634 2:0.08765484 3:0.36628046 4:0.45852823 5:0.56005228 6:0.65708595 \n-4 1:0.59853216 2:0.08765484 3:0.42071301 4:0.56376512 5:0.66454599 6:0.741236 \n-4 1:0.79297271 2:0.08765484 3:0.5597726 4:0.80653689 5:0.88996341 6:0.92691132 \n-4 1:0.76798941 2:0.08765484 3:0.52069978 4:0.74484555 5:0.83431246 6:0.87935204 \n-4 1:0.73225133 2:0.08765484 3:0.47011786 4:0.66069877 5:0.75226598 6:0.80539407 \n-4 1:0.87240592 2:0.08765484 3:0.62680052 4:0.88208508 5:0.93041565 6:0.9505376 \n-4 1:0.84834872 2:0.08765484 3:0.58154998 4:0.82429855 5:0.8858516 6:0.91563291 \n-4 1:0.84365382 2:0.93973481 3:0.36718425 4:0.81512123 5:0.88887359 6:0.92320992 \n-4 1:0.89242364 2:0.93973481 3:0.41336953 4:0.88038833 5:0.93688884 6:0.95992879 \n-4 1:0.31373571 2:0.93973481 3:0.18757116 4:0.34864297 5:0.3777168 6:0.38922611 \n-4 1:0.42490775 2:0.93973481 3:0.20295859 4:0.39290035 5:0.43632323 6:0.45871216 \n-4 1:0.66865444 2:0.93973481 3:0.28594627 4:0.63969879 5:0.73360583 6:0.78380069 \n-4 1:0.62642524 2:0.93973481 3:0.26141889 4:0.56602175 5:0.64775366 6:0.69263211 \n-4 1:0.57430455 2:0.93973481 3:0.23537634 4:0.48984694 5:0.55363885 6:0.5853905 \n-4 1:0.76178555 2:0.93973481 3:0.32205372 4:0.7176044 5:0.80237787 6:0.84588741 \n-4 1:0.72282163 2:0.93973481 3:0.29554025 4:0.64471949 5:0.72634443 6:0.77062686 \n-4 1:0.67693861 2:0.93973481 3:0.2669659 4:0.56720118 5:0.63868728 6:0.67673331 \n4 1:0.86023804 2:0.49739676 3:0.53966638 4:0.77392585 5:0.84784447 6:0.89031641 \n1.296591709971377 1:0.31779385 2:0.49739676 3:0.17094319 4:0.12195679 5:0.13277563 6:0.14165413 \n4 1:0.68317784 2:0.49739676 3:0.37192301 4:0.52750491 5:0.62426522 6:0.6929947 \n4 1:0.55611181 2:0.49739676 3:0.24752355 4:0.28326524 5:0.33261781 6:0.37104424 \n4 1:0.7772257 2:0.49739676 3:0.43832146 4:0.63397606 5:0.7240692 6:0.78367237 \n4 1:0.66186286 2:0.49739676 3:0.30599867 4:0.39201262 5:0.45927759 
6:0.51239284 \n4 1:0.94601776 2:0.04579546 3:0.69472114 4:0.97790884 5:0.9891237 6:0.993277 \n4 1:0.98838404 2:0.04579546 3:0.90293444 4:0.99181622 5:0.99642641 6:0.9978864 \n4 1:0.30006056 2:0.04579546 3:0.31879 4:0.45852885 5:0.59717781 6:0.71487885 \n-4 1:0.44902891 2:0.04579546 3:0.35412414 4:0.55926446 5:0.70175505 6:0.79649177 \n-4 1:0.69856222 2:0.04579546 3:0.45989947 4:0.82115248 5:0.92520734 6:0.9594384 \n-4 1:0.67730161 2:0.04579546 3:0.44400319 4:0.77920819 5:0.88713866 6:0.92903178 \n-4 1:0.64419192 2:0.04579546 3:0.42297435 4:0.72390263 5:0.83364665 6:0.88344569 \n-4 1:0.80781899 2:0.04579546 3:0.52334234 4:0.88859427 5:0.94013924 6:0.95946903 \n-4 1:0.78080761 2:0.04579546 3:0.499439 4:0.84012074 5:0.90229375 6:0.92936693 \n4 1:0.97128596 2:0.014623935 3:0.90135809 4:0.99584619 5:0.9970631 6:0.99757649 \n4 1:0.99645027 2:0.014623935 3:1 4:1 5:1 6:1 \n-4 1:0.5326065 2:0.014623935 3:0.75468972 4:0.76017077 5:0.83753774 6:0.92265059 \n-4 1:0.62757004 2:0.014623935 3:0.77708563 4:0.84258654 5:0.91016348 6:0.95440359 \n-4 1:0.79306842 2:0.014623935 3:0.78900741 4:0.90386551 5:0.96905764 6:0.98466408 \n-4 1:0.77722867 2:0.014623935 3:0.78701408 4:0.89679281 5:0.96056131 6:0.977629 \n-4 1:0.75934622 2:0.014623935 3:0.78422805 4:0.88268036 5:0.94383829 6:0.96596858 \n-4 1:0.8878718 2:0.014623935 3:0.81445984 4:0.96615706 5:0.98858241 6:0.99176534 \n-4 1:0.88211614 2:0.014623935 3:0.81253935 4:0.95982371 5:0.98309178 6:0.9870796 \n4 1:0.83805466 2:0.22767235 3:0.31750162 4:0.85145925 5:0.9121085 6:0.93772147 \n4 1:0.86620985 2:0.22767235 3:0.35742938 4:0.89821492 5:0.94339974 6:0.96076173 \n4 1:0.39289606 2:0.22767235 3:0.12019254 4:0.3951559 5:0.44657802 6:0.46771549 \n4 1:0.48692411 2:0.22767235 3:0.13362033 4:0.43434224 5:0.49900609 6:0.53177669 \n4 1:0.69743918 2:0.22767235 3:0.2263303 4:0.68859985 5:0.78706365 6:0.83662428 \n4 1:0.65237548 2:0.22767235 3:0.19328493 4:0.60107975 5:0.69684945 6:0.74949279 \n4 1:0.59461718 2:0.22767235 3:0.15963705 
4:0.51010642 5:0.59283393 6:0.63883591 \n4 1:0.77302727 2:0.22767235 3:0.26078021 4:0.76359704 5:0.8470807 6:0.8858359 \n4 1:0.72953038 2:0.22767235 3:0.22331233 4:0.67735915 5:0.77029889 6:0.81802539 \n4 1:0.87210923 2:0.16787772 3:0.69408521 4:0.91495146 5:0.94890261 6:0.96269344 \n-4 1:0.81595959 2:0.08765484 3:0.52947327 4:0.7501341 5:0.82294191 6:0.86264385 \n4 1:0.72562415 2:0.49739676 3:0.37130724 4:0.51472366 5:0.59961357 6:0.66258291 \n-4 1:0.87135693 2:0.014623935 3:0.80905852 4:0.94637428 5:0.97242826 6:0.97946694 \n-4 1:0.48910215 2:0.16787772 3:0.49792761 4:0.59161372 5:0.62979552 6:0.64254584 \n-4 1:0.5685964 2:0.16787772 3:0.5149767 4:0.63026581 5:0.67890679 6:0.69964851 \n-4 1:0.75935478 2:0.16787772 3:0.60695536 4:0.80906778 5:0.87125816 6:0.89810007 \n-4 1:0.71788601 2:0.16787772 3:0.57600091 4:0.75310216 5:0.81471966 6:0.84249923 \n-4 1:0.66516668 2:0.16787772 3:0.54473368 4:0.69254626 5:0.74796983 6:0.77177867 \n4 1:0.81880869 2:0.16787772 3:0.64309172 4:0.86078024 5:0.90892223 6:0.92908907 \n-4 1:0.78054558 2:0.16787772 3:0.60849279 4:0.80724494 5:0.86183239 6:0.88618408 \n4 1:0.95353512 2:0.055921852 3:0.61526026 4:0.94655706 5:0.97211195 6:0.98210701 \n4 1:0.98368527 2:0.055921852 3:0.7405327 4:0.96928567 5:0.9853799 6:0.99080378 \n4 1:0.11318821 2:0.055921852 3:0.1590151 4:0.30536689 5:0.48614515 6:0.64344462 \n4 1:0.30298819 2:0.055921852 3:0.19401703 4:0.41679982 5:0.61495039 6:0.74140301 \n4 1:0.60614412 2:0.055921852 3:0.31791569 4:0.72365433 5:0.88324129 6:0.93484545 \n4 1:0.58738733 2:0.055921852 3:0.29301498 4:0.67070014 5:0.83429953 6:0.89348041 \n4 1:0.79496816 2:0.055921852 3:0.42192974 4:0.86711004 5:0.94030868 6:0.96084539 \n4 1:0.77749763 2:0.055921852 3:0.38714172 4:0.81340799 5:0.90059649 6:0.93006702 \n4 1:0.75215882 2:0.055921852 3:0.34721658 4:0.73960747 5:0.84370247 6:0.88485372 \n4 1:0.89732805 2:0.58937038 3:0.58823535 4:0.80035053 5:0.86988422 6:0.90533033 \n-4 1:0.9228759 2:0.58937038 3:0.65797705 4:0.87169952 
5:0.92200942 6:0.94454256 \n4 1:0.19504362 2:0.58937038 3:0.21585801 4:0.1754362 5:0.20844015 6:0.23846443 \n4 1:0.34425894 2:0.58937038 3:0.24672569 4:0.24188506 5:0.29544562 6:0.33843061 \n4 1:0.66407117 2:0.58937038 3:0.40045124 4:0.55415203 5:0.66628031 6:0.73418465 \n4 1:0.60780044 2:0.58937038 3:0.34931828 4:0.4519606 5:0.54893247 6:0.61355219 \n4 1:0.53476258 2:0.58937038 3:0.29851601 4:0.34826788 5:0.42168642 6:0.47203603 \n4 1:0.79195776 2:0.58937038 3:0.47493233 4:0.66775916 5:0.76196439 6:0.81489875 \n4 1:0.7415564 2:0.58937038 3:0.41507439 4:0.56413083 5:0.65815516 6:0.7166999 \n4 1:0.82021207 2:1 3:0.37381485 4:0.7891612 5:0.87031145 6:0.90944281 \n-3.795805084530972 1:0.85903236 2:1 3:0.43235998 4:0.86707094 5:0.92632217 6:0.95151451 \n-4 1:0.25243046 2:1 3:0.084027451 4:0.15537936 5:0.17410072 6:0.17212333 \n-4 1:0.35643487 2:1 3:0.10644455 4:0.21484368 5:0.25587544 6:0.27527817 \n-4 1:0.57605414 2:1 3:0.19031962 4:0.43030863 5:0.5277316 6:0.59069772 \n-4 1:0.49071444 2:1 3:0.14452095 4:0.31406915 5:0.38353445 6:0.42653517 \n4 1:0.73255545 2:1 3:0.28883701 4:0.65284485 5:0.75623242 6:0.81297442 \n0.4082706381617505 1:0.67015395 2:1 3:0.2367756 4:0.5367057 5:0.64063877 6:0.70451767 \n-4 1:0.84450653 2:0.083369236 3:0.57279245 4:0.85249389 5:0.91751611 6:0.94621989 \n-4 1:0.39559773 2:0.083369236 3:0.28184137 4:0.37025203 5:0.46733936 6:0.53517338 \n-4 1:0.70621493 2:0.083369236 3:0.42718441 4:0.69347659 5:0.81124449 6:0.87136343 \n-4 1:0.65615861 2:0.083369236 3:0.37833052 4:0.59301482 5:0.71772587 6:0.7905538 \n-4 1:0.58837863 2:0.083369236 3:0.33229353 4:0.48675881 5:0.60141743 6:0.67458413 \n-4 1:0.77687144 2:0.083369236 3:0.48094343 4:0.76665994 5:0.86191893 6:0.90760934 \n-1.966116876631112 1:0.72849768 2:0.083369236 3:0.42082971 4:0.66591147 5:0.77995959 6:0.84260661 \n-3.906831378063804 1:0.66320082 2:0.083369236 3:0.36350305 4:0.54888271 5:0.66506794 6:0.73685112 \n4 1:0.84500499 2:0.42532178 3:0.43562507 4:0.80721931 5:0.87934044 6:0.91434143 
\n4 1:0.8874543 2:0.42532178 3:0.50912639 4:0.87959883 5:0.93223488 6:0.95450335 \n4 1:0.31032192 2:0.42532178 3:0.18976794 4:0.30662908 5:0.34637104 6:0.3661022 \n4 1:0.41026349 2:0.42532178 3:0.20589097 4:0.35241209 5:0.40358156 6:0.42577381 \n4 1:0.67552108 2:0.42532178 3:0.30879992 4:0.60375124 5:0.70097073 6:0.75507206 \n4 1:0.62772585 2:0.42532178 3:0.27349745 4:0.5196735 5:0.60339149 6:0.65103342 \n4 1:0.5741386 2:0.42532178 3:0.24033766 4:0.43855753 5:0.50243186 6:0.53322825 \n4 1:0.7629976 2:0.42532178 3:0.35347476 4:0.69239941 5:0.78245146 6:0.83117443 \n4 1:0.71746409 2:0.42532178 3:0.31296983 4:0.60525302 5:0.69243388 6:0.7432587 \n-4 1:0.73137955 2:0.16787772 3:0.57222383 4:0.74405775 5:0.79993424 6:0.82484891 \n4 1:0.67383121 2:0.58937038 3:0.35481019 4:0.45269287 5:0.53578336 6:0.59116487 \n4 1:0.5905971 2:1 3:0.18559792 4:0.41535212 5:0.50422336 6:0.56173557 \n4 1:0.66157018 2:0.42532178 3:0.27479904 4:0.51802649 5:0.59270541 6:0.63560969 \n-4 1:0.66827754 2:0.54342577 3:0.18169339 4:0.50290989 5:0.59875259 6:0.65332628 \n4 1:0.85027066 2:0.20820673 3:0.40997978 4:0.82462749 5:0.89794736 6:0.93142825 \n4 1:0.87892054 2:0.20820673 3:0.45891267 4:0.87823329 5:0.93535353 6:0.95883927 \n4 1:0.3986268 2:0.20820673 3:0.17753958 4:0.33495583 5:0.39777832 6:0.44399359 \n-4 1:0.48997993 2:0.20820673 3:0.20172681 4:0.39715881 5:0.47368229 6:0.52781628 \n4 1:0.7022939 2:0.20820673 3:0.31094767 4:0.6676259 5:0.77726116 6:0.83518027 \n-4 1:0.65773092 2:0.20820673 3:0.27420721 4:0.57889989 5:0.68485118 6:0.74837036 \n0.2951376518668717 1:0.60031736 2:0.20820673 3:0.23419121 4:0.48018865 5:0.57200972 6:0.63197473 \n4 1:0.77623676 2:0.20820673 3:0.3510016 4:0.74206651 5:0.83508543 6:0.88101902 \n4 1:0.73562396 2:0.20820673 3:0.31004997 4:0.6557112 5:0.75585014 6:0.81164989 \n-4 1:0.67923081 2:0.20820673 3:0.26679137 4:0.55816547 5:0.65579282 6:0.71593631 \n4 1:0.83968539 2:0.54342577 3:0.32439292 4:0.78747769 5:0.87303614 6:0.91271252 \n4 1:0.86656342 2:0.54342577 
3:0.37898741 4:0.85252726 5:0.92049615 6:0.94848246 \n-4 1:0.42728303 2:0.54342577 3:0.10123262 4:0.31581962 5:0.38571265 6:0.42827036 \n-4 1:0.63194526 2:0.54342577 3:0.18169045 4:0.51611903 5:0.62179755 6:0.68216176 \n4 1:0.56954706 2:0.54342577 3:0.14271477 4:0.41491191 5:0.50173488 6:0.55220392 \n4 1:0.76753176 2:0.54342577 3:0.26295318 4:0.6905031 5:0.79291823 6:0.84469464 \n-4 1:0.72348649 2:0.54342577 3:0.22334634 4:0.60145902 5:0.70573225 6:0.76318544 \n4 1:0.83584492 2:0.047285912 3:0.53826775 4:0.933335 5:0.95948954 6:0.96870909 \n4 1:0.85530855 2:0.047285912 3:0.55323777 4:0.95113339 5:0.97249918 6:0.9795177 \n-4 1:0.53835734 2:0.047285912 3:0.41965074 4:0.71632669 5:0.73953043 6:0.73487553 \n-4 1:0.59175144 2:0.047285912 3:0.43113594 4:0.74141738 5:0.76929188 6:0.77018949 \n-4 1:0.75962366 2:0.047285912 3:0.49613729 4:0.87838146 5:0.91688438 6:0.93150362 \n-4 1:0.72043129 2:0.047285912 3:0.47217411 4:0.83138845 5:0.8704229 6:0.88419439 \n-4 1:0.67287449 2:0.047285912 3:0.44652268 4:0.77691812 5:0.81043483 6:0.8177009 \n-4 1:0.8023177 2:0.047285912 3:0.51559706 4:0.90512389 5:0.93743101 6:0.9492968 \n-4 1:0.76751376 2:0.047285912 3:0.49225957 4:0.86357299 5:0.89948127 6:0.91221155 \n-4 1:0.72124785 2:0.047285912 3:0.46606653 4:0.81323145 5:0.84847474 6:0.85892657 \n",
"feature_dict": {
-        "VMAF_integer_feature": [
+        "VMAF_feature": [
"vif_scale0",
"vif_scale1",
"vif_scale2",
......
@@ -29,12 +29,12 @@
100.0
],
"feature_names": [
-        "VMAF_feature_adm2_score",
-        "VMAF_feature_motion2_score",
-        "VMAF_feature_vif_scale0_score",
-        "VMAF_feature_vif_scale1_score",
-        "VMAF_feature_vif_scale2_score",
-        "VMAF_feature_vif_scale3_score"
+        "VMAF_integer_feature_adm2_score",
+        "VMAF_integer_feature_motion2_score",
+        "VMAF_integer_feature_vif_scale0_score",
+        "VMAF_integer_feature_vif_scale1_score",
+        "VMAF_integer_feature_vif_scale2_score",
+        "VMAF_integer_feature_vif_scale3_score"
],
"intercepts": [
-0.3092981927591963,
@@ -56,7 +56,7 @@
2.08656468286432
],
"feature_dict": {
-        "VMAF_feature": [
+        "VMAF_integer_feature": [
"vif_scale0",
"vif_scale1",
"vif_scale2",
......
@@ -55,12 +55,12 @@
100.0
],
"feature_names": [
-        "VMAF_feature_adm2_score",
-        "VMAF_feature_motion2_score",
-        "VMAF_feature_vif_scale0_score",
-        "VMAF_feature_vif_scale1_score",
-        "VMAF_feature_vif_scale2_score",
-        "VMAF_feature_vif_scale3_score"
+        "VMAF_integer_feature_adm2_score",
+        "VMAF_integer_feature_motion2_score",
+        "VMAF_integer_feature_vif_scale0_score",
+        "VMAF_integer_feature_vif_scale1_score",
+        "VMAF_integer_feature_vif_scale2_score",
+        "VMAF_integer_feature_vif_scale3_score"
],
"intercepts": [
-0.3092981927591963,
@@ -74,7 +74,7 @@
"model_type": "LIBSVMNUSVR",
"model": "svm_type nu_svr\nkernel_type rbf\ngamma 0.04\nnr_class 2\ntotal_sv 211\nrho -1.33133\nSV\n-4 1:0.65734273 2:0.34681232 3:0.093755557 4:0.60913934 5:0.69117362 6:0.73495824 \n4 1:0.8727433 2:0.49612229 3:0.59146724 4:0.78105663 5:0.84916292 6:0.8882561 \n4 1:0.89890005 2:0.49612229 3:0.66823667 4:0.86050887 5:0.90873162 6:0.93335071 \n4 1:0.20371751 2:0.49612229 3:0.10534315 4:-1.110223e-16 6:2.220446e-16 \n4 1:0.33913836 2:0.49612229 3:0.14024497 4:0.074708413 5:0.10231651 6:0.1259153 \n4 1:0.66426757 2:0.49612229 3:0.35268026 4:0.4805681 5:0.59603341 6:0.67408692 \n4 1:0.59561632 2:0.49612229 3:0.27561601 4:0.33977371 5:0.4325213 6:0.50244952 \n4 1:0.50821444 2:0.49612229 3:0.20276685 4:0.2004308 5:0.25758651 6:0.30054029 \n4 1:0.77877298 2:0.49612229 3:0.444392 4:0.61630491 5:0.71210086 6:0.77386496 \n4 1:0.71666017 2:0.49612229 3:0.35967401 4:0.47825205 5:0.57045236 6:0.63752441 \n4 1:0.64025669 2:0.49612229 3:0.27766156 4:0.33407105 5:0.40732401 6:0.46359154 \n4 1:0.88343983 2:0.23066177 3:0.65873851 4:0.86090402 5:0.90661213 6:0.93008753 \n4 1:0.90822691 2:0.23066177 3:0.71439481 4:0.90904598 5:0.94146542 6:0.95674338 \n-4 1:0.49037399 2:0.23066177 3:0.32329421 4:0.33686197 5:0.39456977 6:0.44944683 \n-4 1:0.69044383 2:0.23066177 3:0.43933868 4:0.56327049 5:0.65339511 6:0.71348696 \n-4 1:0.62390093 2:0.23066177 3:0.3800888 4:0.44927578 5:0.52327759 6:0.57907725 \n4 1:0.81887942 2:0.23066177 3:0.56208506 4:0.76164281 5:0.83176644 6:0.86914911 \n4 1:0.77189471 2:0.23066177 3:0.50145055 4:0.66525882 5:0.74327951 6:0.79017822 \n4 1:0.71405433 2:0.23066177 3:0.43952897 4:0.55736023 5:0.63319876 6:0.68402869 \n4 1:0.92114073 3:0.45198963 4:0.97703695 5:0.9907273 6:0.99510256 \n4 1:1 3:0.83319067 4:0.98956086 5:0.99577089 6:0.99784595 \n4 4:0.10344019 5:0.34323945 6:0.63855969 \n4 1:0.19531482 3:0.034330388 4:0.25480402 5:0.54197045 6:0.78020579 \n4 1:0.48394064 3:0.11866359 4:0.58816959 5:0.86435738 6:0.96191842 \n4 1:0.47628079 3:0.11185039 4:0.56180003 
5:0.83415721 6:0.93617329 \n4 1:0.46278632 3:0.10308547 4:0.52247575 5:0.78583924 6:0.89392193 \n4 1:0.7038079 3:0.2174879 4:0.84423613 5:0.9662906 6:0.98430594 \n4 1:0.69596686 3:0.20657211 4:0.81196884 5:0.94140702 6:0.96680805 \n4 1:0.68404358 3:0.19261438 4:0.76066415 5:0.89973293 6:0.93660362 \n4 1:0.84073022 2:0.34681232 3:0.22411304 4:0.88845644 5:0.94169671 6:0.96221395 \n-4 1:0.33900937 2:0.34681232 3:0.027607294 4:0.40659646 5:0.45456869 6:0.48256597 \n-4 1:0.44593129 2:0.34681232 3:0.041939301 4:0.45284872 5:0.5157613 6:0.55335821 \n-4 1:0.67301747 2:0.34681232 3:0.11526222 4:0.68549511 5:0.78556255 6:0.83507583 \n-4 1:0.62833533 2:0.34681232 3:0.092281981 4:0.61278125 5:0.70626575 6:0.75613977 \n-4 1:0.57196879 2:0.34681232 3:0.067548447 4:0.53383404 5:0.61287548 6:0.65468717 \n-0.3312466607741135 1:0.75125028 2:0.34681232 3:0.1457048 4:0.75791308 5:0.84155109 6:0.88132116 \n-4 1:0.71121936 2:0.34681232 3:0.12095689 4:0.68834617 5:0.77453583 6:0.81892861 \n-4 1:0.80269544 2:0.25207203 3:0.3681723 4:0.80658472 5:0.8702283 6:0.90583519 \n-4 1:0.86095387 2:0.25207203 3:0.52475418 4:0.85053413 5:0.90454501 6:0.93093678 \n-4 1:0.5008963 2:0.25207203 3:0.2005129 4:0.41516485 5:0.45282017 6:0.47396143 \n-4 1:0.56977992 2:0.25207203 3:0.21631076 4:0.45848604 5:0.51102137 6:0.53823055 \n-4 1:0.72779828 2:0.25207203 3:0.3051639 4:0.67537297 5:0.75767261 6:0.80327187 \n-4 1:0.68848569 2:0.25207203 3:0.27393051 4:0.60399854 5:0.68000038 6:0.72275152 \n-4 1:0.64121401 2:0.25207203 3:0.23994344 4:0.52538719 5:0.5891732 6:0.62164073 \n-4 1:0.76673633 2:0.25207203 3:0.33053889 4:0.73085549 5:0.80341439 6:0.84546456 \n-4 1:0.73041172 2:0.25207203 3:0.29691153 4:0.66166141 5:0.73408074 6:0.77757209 \n-4 1:0.68529047 2:0.25207203 3:0.26283557 4:0.58611788 5:0.65192525 6:0.69015011 \n4 1:0.86902267 2:0.48885268 3:0.5143645 4:0.8587242 5:0.91841685 6:0.94498293 \n4 1:0.89266106 2:0.48885268 3:0.55208861 4:0.89938377 5:0.94642982 6:0.96615102 \n-4 1:0.42554844 2:0.48885268 
3:0.2554221 4:0.36916892 5:0.43100226 6:0.50888404 \n-4 1:0.52520274 2:0.48885268 3:0.27824915 4:0.42915458 5:0.50850476 6:0.58585271 \n-4 1:0.69357445 2:0.48885268 3:0.35289928 4:0.61359907 5:0.7217863 6:0.78790011 \n-4 1:0.64679648 2:0.48885268 3:0.31268451 4:0.5167094 5:0.61224976 6:0.68477529 \n4 1:0.80595874 2:0.48885268 3:0.44075432 4:0.7803455 5:0.86328719 6:0.90222545 \n-4 1:0.7715192 2:0.48885268 3:0.4012577 4:0.70792536 5:0.80063653 6:0.85083872 \n4 1:0.82199966 2:0.20629643 3:0.30562098 4:0.80541317 5:0.89285836 6:0.92907353 \n-4 1:0.84774006 2:0.20629643 3:0.36755712 4:0.8681203 5:0.93297792 6:0.95700049 \n4 1:0.26631905 2:0.20629643 3:0.076468978 4:0.29833807 5:0.37989948 6:0.4576277 \n-4 1:0.65439648 2:0.20629643 3:0.19487894 4:0.63045155 5:0.76931142 6:0.83706632 \n4 1:0.55295603 2:0.20629643 3:0.13877412 4:0.4724047 5:0.59295828 6:0.66834832 \n4 1:0.75448924 2:0.20629643 3:0.24707248 4:0.72284103 5:0.83178838 6:0.88053503 \n4 1:0.83852041 2:0.15600331 3:0.1625414 4:0.81948421 5:0.90185357 6:0.9347395 \n4 1:0.85805266 2:0.15600331 3:0.19693206 4:0.86294641 5:0.92990351 6:0.95498998 \n-4 1:0.43384835 2:0.15600331 3:0.030541611 4:0.37279112 5:0.4588284 6:0.52004828 \n-4 1:0.72588966 2:0.48885268 3:0.35394597 4:0.61189191 5:0.70897304 6:0.77099691 \n-4 1:0.65865915 2:0.20629643 3:0.1796405 4:0.56432133 5:0.68049028 6:0.74616621 \n-4 1:0.53095193 2:0.15600331 3:0.046271684 4:0.4328793 5:0.5309142 6:0.59282089 \n-4 1:0.71891465 2:0.15600331 3:0.11085278 4:0.68794624 5:0.80350923 6:0.85660483 \n-4 1:0.68635753 2:0.15600331 3:0.091457045 4:0.60849701 5:0.72282659 6:0.78137183 \n-4 1:0.64162333 2:0.15600331 3:0.068820233 4:0.51732819 5:0.62198733 6:0.67977328 \n4 1:0.78395225 2:0.15600331 3:0.13401869 4:0.75274384 5:0.8506531 6:0.89321405 \n-4 1:0.75276337 2:0.15600331 3:0.11289462 4:0.67598462 5:0.78117168 6:0.83259364 \n-4 1:0.71345342 2:0.15600331 3:0.089218917 4:0.58797907 5:0.69284768 6:0.74971699 \n4 1:0.93500967 2:0.08765484 3:0.72226864 4:0.93291747 
5:0.960644 6:0.97304054 \n4 1:0.95150668 2:0.08765484 3:0.77391346 4:0.95596295 5:0.97544784 6:0.98405871 \n-4 1:0.48148634 2:0.08765484 3:0.36628046 4:0.45852823 5:0.56005228 6:0.65708595 \n-4 1:0.59853216 2:0.08765484 3:0.42071301 4:0.56376512 5:0.66454599 6:0.741236 \n-4 1:0.79297271 2:0.08765484 3:0.5597726 4:0.80653689 5:0.88996341 6:0.92691132 \n-4 1:0.76798941 2:0.08765484 3:0.52069978 4:0.74484555 5:0.83431246 6:0.87935204 \n-4 1:0.73225133 2:0.08765484 3:0.47011786 4:0.66069877 5:0.75226598 6:0.80539407 \n-4 1:0.87240592 2:0.08765484 3:0.62680052 4:0.88208508 5:0.93041565 6:0.9505376 \n-4 1:0.84834872 2:0.08765484 3:0.58154998 4:0.82429855 5:0.8858516 6:0.91563291 \n-4 1:0.84365382 2:0.93973481 3:0.36718425 4:0.81512123 5:0.88887359 6:0.92320992 \n-4 1:0.89242364 2:0.93973481 3:0.41336953 4:0.88038833 5:0.93688884 6:0.95992879 \n-4 1:0.31373571 2:0.93973481 3:0.18757116 4:0.34864297 5:0.3777168 6:0.38922611 \n-4 1:0.42490775 2:0.93973481 3:0.20295859 4:0.39290035 5:0.43632323 6:0.45871216 \n-4 1:0.66865444 2:0.93973481 3:0.28594627 4:0.63969879 5:0.73360583 6:0.78380069 \n-4 1:0.62642524 2:0.93973481 3:0.26141889 4:0.56602175 5:0.64775366 6:0.69263211 \n-4 1:0.57430455 2:0.93973481 3:0.23537634 4:0.48984694 5:0.55363885 6:0.5853905 \n-4 1:0.76178555 2:0.93973481 3:0.32205372 4:0.7176044 5:0.80237787 6:0.84588741 \n-4 1:0.72282163 2:0.93973481 3:0.29554025 4:0.64471949 5:0.72634443 6:0.77062686 \n-4 1:0.67693861 2:0.93973481 3:0.2669659 4:0.56720118 5:0.63868728 6:0.67673331 \n4 1:0.86023804 2:0.49739676 3:0.53966638 4:0.77392585 5:0.84784447 6:0.89031641 \n1.296591709971377 1:0.31779385 2:0.49739676 3:0.17094319 4:0.12195679 5:0.13277563 6:0.14165413 \n4 1:0.68317784 2:0.49739676 3:0.37192301 4:0.52750491 5:0.62426522 6:0.6929947 \n4 1:0.55611181 2:0.49739676 3:0.24752355 4:0.28326524 5:0.33261781 6:0.37104424 \n4 1:0.7772257 2:0.49739676 3:0.43832146 4:0.63397606 5:0.7240692 6:0.78367237 \n4 1:0.66186286 2:0.49739676 3:0.30599867 4:0.39201262 5:0.45927759 
6:0.51239284 \n4 1:0.94601776 2:0.04579546 3:0.69472114 4:0.97790884 5:0.9891237 6:0.993277 \n4 1:0.98838404 2:0.04579546 3:0.90293444 4:0.99181622 5:0.99642641 6:0.9978864 \n4 1:0.30006056 2:0.04579546 3:0.31879 4:0.45852885 5:0.59717781 6:0.71487885 \n-4 1:0.44902891 2:0.04579546 3:0.35412414 4:0.55926446 5:0.70175505 6:0.79649177 \n-4 1:0.69856222 2:0.04579546 3:0.45989947 4:0.82115248 5:0.92520734 6:0.9594384 \n-4 1:0.67730161 2:0.04579546 3:0.44400319 4:0.77920819 5:0.88713866 6:0.92903178 \n-4 1:0.64419192 2:0.04579546 3:0.42297435 4:0.72390263 5:0.83364665 6:0.88344569 \n-4 1:0.80781899 2:0.04579546 3:0.52334234 4:0.88859427 5:0.94013924 6:0.95946903 \n-4 1:0.78080761 2:0.04579546 3:0.499439 4:0.84012074 5:0.90229375 6:0.92936693 \n4 1:0.97128596 2:0.014623935 3:0.90135809 4:0.99584619 5:0.9970631 6:0.99757649 \n4 1:0.99645027 2:0.014623935 3:1 4:1 5:1 6:1 \n-4 1:0.5326065 2:0.014623935 3:0.75468972 4:0.76017077 5:0.83753774 6:0.92265059 \n-4 1:0.62757004 2:0.014623935 3:0.77708563 4:0.84258654 5:0.91016348 6:0.95440359 \n-4 1:0.79306842 2:0.014623935 3:0.78900741 4:0.90386551 5:0.96905764 6:0.98466408 \n-4 1:0.77722867 2:0.014623935 3:0.78701408 4:0.89679281 5:0.96056131 6:0.977629 \n-4 1:0.75934622 2:0.014623935 3:0.78422805 4:0.88268036 5:0.94383829 6:0.96596858 \n-4 1:0.8878718 2:0.014623935 3:0.81445984 4:0.96615706 5:0.98858241 6:0.99176534 \n-4 1:0.88211614 2:0.014623935 3:0.81253935 4:0.95982371 5:0.98309178 6:0.9870796 \n4 1:0.83805466 2:0.22767235 3:0.31750162 4:0.85145925 5:0.9121085 6:0.93772147 \n4 1:0.86620985 2:0.22767235 3:0.35742938 4:0.89821492 5:0.94339974 6:0.96076173 \n4 1:0.39289606 2:0.22767235 3:0.12019254 4:0.3951559 5:0.44657802 6:0.46771549 \n4 1:0.48692411 2:0.22767235 3:0.13362033 4:0.43434224 5:0.49900609 6:0.53177669 \n4 1:0.69743918 2:0.22767235 3:0.2263303 4:0.68859985 5:0.78706365 6:0.83662428 \n4 1:0.65237548 2:0.22767235 3:0.19328493 4:0.60107975 5:0.69684945 6:0.74949279 \n4 1:0.59461718 2:0.22767235 3:0.15963705 
4:0.51010642 5:0.59283393 6:0.63883591 \n4 1:0.77302727 2:0.22767235 3:0.26078021 4:0.76359704 5:0.8470807 6:0.8858359 \n4 1:0.72953038 2:0.22767235 3:0.22331233 4:0.67735915 5:0.77029889 6:0.81802539 \n4 1:0.87210923 2:0.16787772 3:0.69408521 4:0.91495146 5:0.94890261 6:0.96269344 \n-4 1:0.81595959 2:0.08765484 3:0.52947327 4:0.7501341 5:0.82294191 6:0.86264385 \n4 1:0.72562415 2:0.49739676 3:0.37130724 4:0.51472366 5:0.59961357 6:0.66258291 \n-4 1:0.87135693 2:0.014623935 3:0.80905852 4:0.94637428 5:0.97242826 6:0.97946694 \n-4 1:0.48910215 2:0.16787772 3:0.49792761 4:0.59161372 5:0.62979552 6:0.64254584 \n-4 1:0.5685964 2:0.16787772 3:0.5149767 4:0.63026581 5:0.67890679 6:0.69964851 \n-4 1:0.75935478 2:0.16787772 3:0.60695536 4:0.80906778 5:0.87125816 6:0.89810007 \n-4 1:0.71788601 2:0.16787772 3:0.57600091 4:0.75310216 5:0.81471966 6:0.84249923 \n-4 1:0.66516668 2:0.16787772 3:0.54473368 4:0.69254626 5:0.74796983 6:0.77177867 \n4 1:0.81880869 2:0.16787772 3:0.64309172 4:0.86078024 5:0.90892223 6:0.92908907 \n-4 1:0.78054558 2:0.16787772 3:0.60849279 4:0.80724494 5:0.86183239 6:0.88618408 \n4 1:0.95353512 2:0.055921852 3:0.61526026 4:0.94655706 5:0.97211195 6:0.98210701 \n4 1:0.98368527 2:0.055921852 3:0.7405327 4:0.96928567 5:0.9853799 6:0.99080378 \n4 1:0.11318821 2:0.055921852 3:0.1590151 4:0.30536689 5:0.48614515 6:0.64344462 \n4 1:0.30298819 2:0.055921852 3:0.19401703 4:0.41679982 5:0.61495039 6:0.74140301 \n4 1:0.60614412 2:0.055921852 3:0.31791569 4:0.72365433 5:0.88324129 6:0.93484545 \n4 1:0.58738733 2:0.055921852 3:0.29301498 4:0.67070014 5:0.83429953 6:0.89348041 \n4 1:0.79496816 2:0.055921852 3:0.42192974 4:0.86711004 5:0.94030868 6:0.96084539 \n4 1:0.77749763 2:0.055921852 3:0.38714172 4:0.81340799 5:0.90059649 6:0.93006702 \n4 1:0.75215882 2:0.055921852 3:0.34721658 4:0.73960747 5:0.84370247 6:0.88485372 \n4 1:0.89732805 2:0.58937038 3:0.58823535 4:0.80035053 5:0.86988422 6:0.90533033 \n-4 1:0.9228759 2:0.58937038 3:0.65797705 4:0.87169952 
5:0.92200942 6:0.94454256 \n4 1:0.19504362 2:0.58937038 3:0.21585801 4:0.1754362 5:0.20844015 6:0.23846443 \n4 1:0.34425894 2:0.58937038 3:0.24672569 4:0.24188506 5:0.29544562 6:0.33843061 \n4 1:0.66407117 2:0.58937038 3:0.40045124 4:0.55415203 5:0.66628031 6:0.73418465 \n4 1:0.60780044 2:0.58937038 3:0.34931828 4:0.4519606 5:0.54893247 6:0.61355219 \n4 1:0.53476258 2:0.58937038 3:0.29851601 4:0.34826788 5:0.42168642 6:0.47203603 \n4 1:0.79195776 2:0.58937038 3:0.47493233 4:0.66775916 5:0.76196439 6:0.81489875 \n4 1:0.7415564 2:0.58937038 3:0.41507439 4:0.56413083 5:0.65815516 6:0.7166999 \n4 1:0.82021207 2:1 3:0.37381485 4:0.7891612 5:0.87031145 6:0.90944281 \n-3.795805084530972 1:0.85903236 2:1 3:0.43235998 4:0.86707094 5:0.92632217 6:0.95151451 \n-4 1:0.25243046 2:1 3:0.084027451 4:0.15537936 5:0.17410072 6:0.17212333 \n-4 1:0.35643487 2:1 3:0.10644455 4:0.21484368 5:0.25587544 6:0.27527817 \n-4 1:0.57605414 2:1 3:0.19031962 4:0.43030863 5:0.5277316 6:0.59069772 \n-4 1:0.49071444 2:1 3:0.14452095 4:0.31406915 5:0.38353445 6:0.42653517 \n4 1:0.73255545 2:1 3:0.28883701 4:0.65284485 5:0.75623242 6:0.81297442 \n0.4082706381617505 1:0.67015395 2:1 3:0.2367756 4:0.5367057 5:0.64063877 6:0.70451767 \n-4 1:0.84450653 2:0.083369236 3:0.57279245 4:0.85249389 5:0.91751611 6:0.94621989 \n-4 1:0.39559773 2:0.083369236 3:0.28184137 4:0.37025203 5:0.46733936 6:0.53517338 \n-4 1:0.70621493 2:0.083369236 3:0.42718441 4:0.69347659 5:0.81124449 6:0.87136343 \n-4 1:0.65615861 2:0.083369236 3:0.37833052 4:0.59301482 5:0.71772587 6:0.7905538 \n-4 1:0.58837863 2:0.083369236 3:0.33229353 4:0.48675881 5:0.60141743 6:0.67458413 \n-4 1:0.77687144 2:0.083369236 3:0.48094343 4:0.76665994 5:0.86191893 6:0.90760934 \n-1.966116876631112 1:0.72849768 2:0.083369236 3:0.42082971 4:0.66591147 5:0.77995959 6:0.84260661 \n-3.906831378063804 1:0.66320082 2:0.083369236 3:0.36350305 4:0.54888271 5:0.66506794 6:0.73685112 \n4 1:0.84500499 2:0.42532178 3:0.43562507 4:0.80721931 5:0.87934044 6:0.91434143 
\n4 1:0.8874543 2:0.42532178 3:0.50912639 4:0.87959883 5:0.93223488 6:0.95450335 \n4 1:0.31032192 2:0.42532178 3:0.18976794 4:0.30662908 5:0.34637104 6:0.3661022 \n4 1:0.41026349 2:0.42532178 3:0.20589097 4:0.35241209 5:0.40358156 6:0.42577381 \n4 1:0.67552108 2:0.42532178 3:0.30879992 4:0.60375124 5:0.70097073 6:0.75507206 \n4 1:0.62772585 2:0.42532178 3:0.27349745 4:0.5196735 5:0.60339149 6:0.65103342 \n4 1:0.5741386 2:0.42532178 3:0.24033766 4:0.43855753 5:0.50243186 6:0.53322825 \n4 1:0.7629976 2:0.42532178 3:0.35347476 4:0.69239941 5:0.78245146 6:0.83117443 \n4 1:0.71746409 2:0.42532178 3:0.31296983 4:0.60525302 5:0.69243388 6:0.7432587 \n-4 1:0.73137955 2:0.16787772 3:0.57222383 4:0.74405775 5:0.79993424 6:0.82484891 \n4 1:0.67383121 2:0.58937038 3:0.35481019 4:0.45269287 5:0.53578336 6:0.59116487 \n4 1:0.5905971 2:1 3:0.18559792 4:0.41535212 5:0.50422336 6:0.56173557 \n4 1:0.66157018 2:0.42532178 3:0.27479904 4:0.51802649 5:0.59270541 6:0.63560969 \n-4 1:0.66827754 2:0.54342577 3:0.18169339 4:0.50290989 5:0.59875259 6:0.65332628 \n4 1:0.85027066 2:0.20820673 3:0.40997978 4:0.82462749 5:0.89794736 6:0.93142825 \n4 1:0.87892054 2:0.20820673 3:0.45891267 4:0.87823329 5:0.93535353 6:0.95883927 \n4 1:0.3986268 2:0.20820673 3:0.17753958 4:0.33495583 5:0.39777832 6:0.44399359 \n-4 1:0.48997993 2:0.20820673 3:0.20172681 4:0.39715881 5:0.47368229 6:0.52781628 \n4 1:0.7022939 2:0.20820673 3:0.31094767 4:0.6676259 5:0.77726116 6:0.83518027 \n-4 1:0.65773092 2:0.20820673 3:0.27420721 4:0.57889989 5:0.68485118 6:0.74837036 \n0.2951376518668717 1:0.60031736 2:0.20820673 3:0.23419121 4:0.48018865 5:0.57200972 6:0.63197473 \n4 1:0.77623676 2:0.20820673 3:0.3510016 4:0.74206651 5:0.83508543 6:0.88101902 \n4 1:0.73562396 2:0.20820673 3:0.31004997 4:0.6557112 5:0.75585014 6:0.81164989 \n-4 1:0.67923081 2:0.20820673 3:0.26679137 4:0.55816547 5:0.65579282 6:0.71593631 \n4 1:0.83968539 2:0.54342577 3:0.32439292 4:0.78747769 5:0.87303614 6:0.91271252 \n4 1:0.86656342 2:0.54342577 
3:0.37898741 4:0.85252726 5:0.92049615 6:0.94848246 \n-4 1:0.42728303 2:0.54342577 3:0.10123262 4:0.31581962 5:0.38571265 6:0.42827036 \n-4 1:0.63194526 2:0.54342577 3:0.18169045 4:0.51611903 5:0.62179755 6:0.68216176 \n4 1:0.56954706 2:0.54342577 3:0.14271477 4:0.41491191 5:0.50173488 6:0.55220392 \n4 1:0.76753176 2:0.54342577 3:0.26295318 4:0.6905031 5:0.79291823 6:0.84469464 \n-4 1:0.72348649 2:0.54342577 3:0.22334634 4:0.60145902 5:0.70573225 6:0.76318544 \n4 1:0.83584492 2:0.047285912 3:0.53826775 4:0.933335 5:0.95948954 6:0.96870909 \n4 1:0.85530855 2:0.047285912 3:0.55323777 4:0.95113339 5:0.97249918 6:0.9795177 \n-4 1:0.53835734 2:0.047285912 3:0.41965074 4:0.71632669 5:0.73953043 6:0.73487553 \n-4 1:0.59175144 2:0.047285912 3:0.43113594 4:0.74141738 5:0.76929188 6:0.77018949 \n-4 1:0.75962366 2:0.047285912 3:0.49613729 4:0.87838146 5:0.91688438 6:0.93150362 \n-4 1:0.72043129 2:0.047285912 3:0.47217411 4:0.83138845 5:0.8704229 6:0.88419439 \n-4 1:0.67287449 2:0.047285912 3:0.44652268 4:0.77691812 5:0.81043483 6:0.8177009 \n-4 1:0.8023177 2:0.047285912 3:0.51559706 4:0.90512389 5:0.93743101 6:0.9492968 \n-4 1:0.76751376 2:0.047285912 3:0.49225957 4:0.86357299 5:0.89948127 6:0.91221155 \n-4 1:0.72124785 2:0.047285912 3:0.46606653 4:0.81323145 5:0.84847474 6:0.85892657 \n",
"feature_dict": {
"VMAF_feature": [
"VMAF_integer_feature": [
"vif_scale0",
"vif_scale1",
"vif_scale2",
......
......@@ -145,7 +145,7 @@ class QualityRunnerTest(unittest.TestCase):
delete_workdir=True,
result_store=None,
optional_dict={
-                'model_filepath': VmafConfig.model_path("vmaf_v0.6.1.json"),
+                'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1.json"),
},
)
self.runner.run()
......@@ -170,7 +170,7 @@ class QualityRunnerTest(unittest.TestCase):
delete_workdir=True,
result_store=None,
optional_dict={
-                'model_filepath': VmafConfig.model_path("vmaf_v0.6.1.json"),
+                'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1.json"),
},
)
self.runner.run()
......
......@@ -65,7 +65,7 @@ class QualityRunnerTest(unittest.TestCase):
delete_workdir=True,
result_store=None,
optional_dict={
-                'model_filepath': VmafConfig.model_path("vmaf_v0.6.1.json"),
+                'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1.json"),
},
)
self.runner.run()
......@@ -91,7 +91,7 @@ class QualityRunnerTest(unittest.TestCase):
delete_workdir=True,
result_store=None,
optional_dict={
-                'model_filepath': VmafConfig.model_path("vmaf_v0.6.1.json"),
+                'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1.json"),
},
)
self.runner.run()
......
......@@ -108,7 +108,7 @@ class LocalExplainerTest(unittest.TestCase):
None, fifo_mode=True,
delete_workdir=True,
result_store=None,
-            optional_dict={'model_filepath': VmafConfig.model_path("vmaf_v0.6.1.json")},
+            optional_dict={'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1.json")},
optional_dict2={'explainer': LocalExplainer(neighbor_samples=100)}
)
......
......@@ -253,7 +253,7 @@ class QualityRunnerTest(unittest.TestCase):
delete_workdir=True,
result_store=None,
optional_dict={
-                'model_filepath': VmafConfig.model_path("vmaf_v0.6.1.json"),
+                'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1.json"),
}
)
self.runner.run(parallelize=True)
......@@ -524,7 +524,7 @@ class QualityRunnerTest(unittest.TestCase):
delete_workdir=True,
result_store=None,
optional_dict={
-                'model_filepath': [VmafConfig.model_path("vmaf_v0.6.1.pkl"), VmafConfig.model_path("other_models", "vmaf_v0.6.0.pkl")],
+                'model_filepath': [VmafConfig.model_path("vmaf_float_v0.6.1.pkl"), VmafConfig.model_path("other_models", "vmaf_v0.6.0.pkl")],
},
)
self.runner.run(parallelize=True)
......@@ -1288,7 +1288,7 @@ class QualityRunnerTest(unittest.TestCase):
delete_workdir=True,
result_store=None,
optional_dict={
-                'model_filepath': VmafConfig.model_path("vmaf_v0.6.1.json"),
+                'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1.json"),
}
)
self.runner.run(parallelize=True)
......@@ -1331,7 +1331,7 @@ class QualityRunnerTest(unittest.TestCase):
delete_workdir=True,
result_store=None,
optional_dict={
-                'model_filepath': VmafConfig.model_path("vmaf_b_v0.6.3.json"),
+                'model_filepath': VmafConfig.model_path("vmaf_float_b_v0.6.3.json"),
}
)
self.runner.run(parallelize=True)
......
......@@ -263,7 +263,7 @@ class ResultAggregatingTest(unittest.TestCase):
delete_workdir=True,
result_store=None,
optional_dict={
-                'model_filepath': VmafConfig.model_path("vmaf_v0.6.1.pkl"),
+                'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1.pkl"),
},
optional_dict2=None,
)
......@@ -313,7 +313,7 @@ class ScoreAggregationTest(unittest.TestCase):
[asset], None, fifo_mode=True,
delete_workdir=True, result_store=FileSystemResultStore(),
optional_dict={
-                'model_filepath': VmafConfig.model_path("vmaf_v0.6.1.json"),
+                'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1.json"),
},
)
self.runner.run()
......
......@@ -320,7 +320,7 @@ class TestTrainOnDataset(unittest.TestCase):
test_dataset = import_python_file(
VmafConfig.test_resource_path('dataset_sample.py'))
test_assets, results = run_test_on_dataset(test_dataset, VmafQualityRunner, None,
-                                                   None, VmafConfig.model_path("vmaf_v0.6.1.json"),
+                                                   None, VmafConfig.model_path("vmaf_float_v0.6.1.json"),
parallelize=True,
aggregate_method=None)
......@@ -338,7 +338,7 @@ class TestTrainOnDataset(unittest.TestCase):
test_dataset = import_python_file(
VmafConfig.test_resource_path('dataset_sample.py'))
test_assets, results = run_test_on_dataset(test_dataset, BootstrapVmafQualityRunner, None,
-                                                   None, VmafConfig.model_path("vmaf_b_v0.6.3.json"),
+                                                   None, VmafConfig.model_path("vmaf_float_b_v0.6.3.json"),
parallelize=True,
aggregate_method=None)
......@@ -359,7 +359,7 @@ class TestTrainOnDataset(unittest.TestCase):
from vmaf.routine import run_test_on_dataset
test_dataset = import_python_file(VmafConfig.test_resource_path('dataset_sample.py'))
test_assets, results = run_test_on_dataset(test_dataset, VmafQualityRunner, None,
-                                                   None, VmafConfig.model_path("vmaf_v0.6.1.json"), parallelize=False,
+                                                   None, VmafConfig.model_path("vmaf_float_v0.6.1.json"), parallelize=False,
aggregate_method=None,
split_test_indices_for_perf_ci=True,
n_splits_test_indices=10)
......@@ -371,7 +371,7 @@ class TestTrainOnDataset(unittest.TestCase):
test_dataset = import_python_file(
VmafConfig.test_resource_path('raw_dataset_sample.py'))
test_assets, results = run_test_on_dataset(test_dataset, VmafQualityRunner, None,
-                                                   None, VmafConfig.model_path("vmaf_v0.6.1.json"),
+                                                   None, VmafConfig.model_path("vmaf_float_v0.6.1.json"),
parallelize=True,
aggregate_method=None)
......@@ -393,7 +393,7 @@ class TestTrainOnDataset(unittest.TestCase):
test_dataset = import_python_file(
VmafConfig.test_resource_path('raw_dataset_sample.py'))
test_assets, results = run_test_on_dataset(test_dataset, VmafQualityRunner, None,
-                                                   None, VmafConfig.model_path("vmaf_v0.6.1.json"),
+                                                   None, VmafConfig.model_path("vmaf_float_v0.6.1.json"),
parallelize=True,
aggregate_method=None,
subj_model_class=MosModel)
......
......@@ -862,7 +862,7 @@ class VmafossexecQualityRunnerSubsamplingTest(unittest.TestCase):
None, fifo_mode=False,
delete_workdir=True,
result_store=None,
-            optional_dict={'disable_clip_score': True, 'model_filepath': VmafConfig.model_path("vmaf_v0.6.1neg.pkl")}
+            optional_dict={'disable_clip_score': True, 'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1neg.pkl")}
)
self.runner.run(parallelize=False)
......@@ -890,7 +890,7 @@ class VmafossexecQualityRunnerSubsamplingTest(unittest.TestCase):
None, fifo_mode=False,
delete_workdir=True,
result_store=None,
-            optional_dict={'disable_clip_score': True, 'model_filepath': VmafConfig.model_path("vmaf_v0.6.1neg.json")}
+            optional_dict={'disable_clip_score': True, 'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1neg.json")}
)
self.runner.run(parallelize=False)
......
......@@ -198,8 +198,8 @@ class VmafrcQualityRunnerTest(unittest.TestCase):
result_store=None,
optional_dict={
'models': [
-                    'path={}:name=custom_vmaf_0'.format(VmafConfig.model_path("vmaf_v0.6.1.pkl")),
-                    'path={}:name=custom_vmaf_1'.format(VmafConfig.model_path("vmaf_v0.6.1.pkl")),
+                    'path={}:name=custom_vmaf_0'.format(VmafConfig.model_path("vmaf_float_v0.6.1.pkl")),
+                    'path={}:name=custom_vmaf_1'.format(VmafConfig.model_path("vmaf_float_v0.6.1.pkl")),
]
}
)
......@@ -223,8 +223,8 @@ class VmafrcQualityRunnerTest(unittest.TestCase):
result_store=None,
optional_dict={
'models': [
-                    'path={}:name=custom_vmaf_0'.format(VmafConfig.model_path("vmaf_v0.6.1.pkl")),
-                    'path={}:name=custom_vmaf_1:enable_transform'.format(VmafConfig.model_path("vmaf_v0.6.1.pkl")),
+                    'path={}:name=custom_vmaf_0'.format(VmafConfig.model_path("vmaf_float_v0.6.1.pkl")),
+                    'path={}:name=custom_vmaf_1:enable_transform'.format(VmafConfig.model_path("vmaf_float_v0.6.1.pkl")),
]
}
)
......@@ -865,7 +865,7 @@ class VmafrcQualityRunnerTest(unittest.TestCase):
None, fifo_mode=False,
delete_workdir=True,
result_store=None,
-            optional_dict={'disable_clip_score': True, 'model_filepath': VmafConfig.model_path("vmaf_v0.6.1neg.pkl")}
+            optional_dict={'disable_clip_score': True, 'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1neg.pkl")}
)
self.runner.run(parallelize=False)
......@@ -893,7 +893,7 @@ class VmafrcQualityRunnerTest(unittest.TestCase):
None, fifo_mode=False,
delete_workdir=True,
result_store=None,
-            optional_dict={'disable_clip_score': True, 'model_filepath': VmafConfig.model_path("vmaf_v0.6.1neg.pkl"),
+            optional_dict={'disable_clip_score': True, 'model_filepath': VmafConfig.model_path("vmaf_float_v0.6.1neg.pkl"),
'adm_enhn_gain_limit': 1.2}
)
self.runner.run(parallelize=False)
......@@ -919,12 +919,12 @@ class VmafrcQualityRunnerTest(unittest.TestCase):
result_store=None,
optional_dict={
'models': [
-                    'path={}:name=vmaf'.format(VmafConfig.model_path("vmaf_v0.6.1.pkl")),
-                    'path={}:name=vmafneg'.format(VmafConfig.model_path("vmaf_v0.6.1neg.pkl")),
+                    'path={}:name=vmaf'.format(VmafConfig.model_path("vmaf_float_v0.6.1.pkl")),
+                    'path={}:name=vmafneg'.format(VmafConfig.model_path("vmaf_float_v0.6.1neg.pkl")),
]
}
)
with self.assertRaises(AssertionError, msg="vmaf_v0.6.1.pkl and vmaf_v0.6.1neg.pkl require the same fex with "