Mohcine Madkour
Published

Sat 12 October 2019


New data augmentation techniques: cutout, mixup & cutmix: Part 3

As you may know, Jeremy Howard claims in his excellent fastai course that data augmentation is perhaps the most important regularization technique when training a model for Computer Vision, second only to getting more data samples (which is often costly or just impossible).

Over the last 2 years, a number of data augmentation techniques have been developed with excellent results on vision datasets.

In this notebook we'll see how easily you can apply some of these new data augmentation techniques to time series using fastai and the fastai_timeseries and torchtimeseries.models libraries available at timeseriesAI.


Import libraries 📚

%reload_ext autoreload
%autoreload 2
%matplotlib inline
from fastai_timeseries import *
from torchtimeseries.models import *

Prepare data 🔢

First we'll create a databunch for the 'Beef' UCR dataset. You can select any other dataset.

dsid = 'Beef'
db = create_UCR_databunch(dsid)
db
TSDataBunch;

Train: LabelList (30 items)
x: TimeSeriesList
TimeSeries(ch=1, seq_len=470),TimeSeries(ch=1, seq_len=470),TimeSeries(ch=1, seq_len=470),TimeSeries(ch=1, seq_len=470),TimeSeries(ch=1, seq_len=470)
y: CategoryList
1,1,1,1,1
Path: .;

Valid: LabelList (30 items)
x: TimeSeriesList
TimeSeries(ch=1, seq_len=470),TimeSeries(ch=1, seq_len=470),TimeSeries(ch=1, seq_len=470),TimeSeries(ch=1, seq_len=470),TimeSeries(ch=1, seq_len=470)
y: CategoryList
1,1,1,1,1
Path: .;

Test: None

Once the databunch's been built, we can easily visualize the time series and their classes using the show_batch method.

☣️ Remember that these charts represent different time series with their respective labels.

db.show_batch()

png

Build learner 🏗

Now I'll create a learner object. As a model, I'll use the new InceptionTime.

learn = Learner(db, InceptionTime(db.features, db.c), metrics=accuracy, 
                loss_func=LabelSmoothingCrossEntropy())

Train 🚵🏼‍♀️

learn.fit_one_cycle(200)
epoch train_loss valid_loss accuracy time
0 1.697133 1.636551 0.200000 00:02
1 1.658765 1.634878 0.200000 00:02
2 1.628749 1.633871 0.200000 00:02
3 1.603675 1.633338 0.200000 00:02
4 1.581709 1.633209 0.200000 00:02
5 1.562016 1.633017 0.200000 00:02
6 1.543772 1.632773 0.200000 00:02
7 1.526412 1.632591 0.200000 00:02
8 1.509713 1.632315 0.200000 00:02
9 1.493474 1.631614 0.200000 00:02
10 1.477614 1.630401 0.200000 00:02
11 1.462065 1.628628 0.200000 00:02
12 1.446795 1.626409 0.200000 00:02
13 1.431797 1.623053 0.200000 00:02
14 1.417064 1.619603 0.200000 00:02
15 1.402596 1.613142 0.200000 00:02
16 1.388477 1.609142 0.200000 00:02
17 1.374838 1.595912 0.200000 00:02
18 1.361802 1.584519 0.200000 00:02
19 1.348744 1.574727 0.200000 00:02
20 1.335601 1.558694 0.266667 00:02
21 1.322362 1.539030 0.366667 00:02
22 1.309687 1.525543 0.400000 00:02
23 1.297099 1.525444 0.333333 00:02
24 1.287485 1.487910 0.366667 00:02
25 1.278417 1.451224 0.366667 00:02
26 1.267326 1.419333 0.433333 00:02
27 1.255071 1.531612 0.266667 00:02
28 1.245143 1.428932 0.333333 00:02
29 1.233745 1.673071 0.333333 00:02
30 1.222732 1.317958 0.400000 00:02
31 1.210769 1.317483 0.433333 00:02
32 1.197816 1.636617 0.333333 00:02
33 1.183761 1.968882 0.300000 00:02
34 1.169847 1.360843 0.400000 00:02
35 1.156803 1.217057 0.533333 00:02
36 1.142337 2.626117 0.366667 00:02
37 1.130414 1.677359 0.433333 00:02
38 1.122333 1.932176 0.400000 00:02
39 1.116616 1.541641 0.466667 00:02
40 1.105934 2.903922 0.366667 00:02
41 1.094385 5.271531 0.233333 00:02
42 1.082686 5.667654 0.266667 00:02
43 1.070797 4.902615 0.266667 00:02
44 1.058120 7.664695 0.233333 00:02
45 1.047126 4.719625 0.400000 00:02
46 1.040817 3.125497 0.333333 00:02
47 1.033824 5.493695 0.333333 00:02
48 1.025297 6.714429 0.366667 00:02
49 1.016156 7.241136 0.333333 00:02
50 1.004743 6.179959 0.366667 00:02
51 0.993645 8.953449 0.333333 00:02
52 0.982641 11.618028 0.266667 00:02
53 0.972119 8.661292 0.333333 00:02
54 0.961185 7.684217 0.333333 00:02
55 0.949132 9.823573 0.300000 00:02
56 0.937367 10.694403 0.333333 00:02
57 0.925187 12.210872 0.266667 00:02
58 0.913146 14.403736 0.233333 00:02
59 0.901192 13.310433 0.233333 00:02
60 0.889542 12.824086 0.266667 00:02
61 0.879126 14.573278 0.233333 00:02
62 0.870193 11.141336 0.333333 00:02
63 0.861951 4.968851 0.333333 00:02
64 0.854350 9.333371 0.333333 00:02
65 0.847897 2.665716 0.500000 00:02
66 0.842686 2.313952 0.366667 00:02
67 0.834853 3.875152 0.366667 00:02
68 0.827271 2.235546 0.433333 00:02
69 0.819326 2.601296 0.400000 00:02
70 0.811829 3.508690 0.466667 00:02
71 0.804304 4.183553 0.366667 00:02
72 0.797064 3.399729 0.366667 00:02
73 0.789246 2.493759 0.533333 00:02
74 0.781297 1.851689 0.466667 00:02
75 0.773465 1.665282 0.433333 00:02
76 0.765998 5.534258 0.266667 00:02
77 0.758704 1.918662 0.466667 00:02
78 0.750972 2.003186 0.500000 00:02
79 0.743452 1.611230 0.533333 00:02
80 0.736011 1.628435 0.533333 00:02
81 0.728675 1.465575 0.600000 00:02
82 0.721576 0.969776 0.733333 00:02
83 0.714612 0.898719 0.700000 00:02
84 0.707672 1.032310 0.666667 00:02
85 0.700603 1.012637 0.700000 00:02
86 0.693883 1.637516 0.500000 00:02
87 0.687245 1.379606 0.566667 00:02
88 0.680597 0.982019 0.700000 00:02
89 0.674161 1.008814 0.700000 00:02
90 0.667775 1.266140 0.600000 00:02
91 0.661540 1.079461 0.600000 00:02
92 0.655445 0.770133 0.833333 00:02
93 0.649447 0.776157 0.833333 00:02
94 0.643603 0.780240 0.833333 00:02
95 0.637886 0.794417 0.766667 00:02
96 0.632302 0.794495 0.766667 00:02
97 0.626835 0.836716 0.733333 00:02
98 0.621506 0.952855 0.700000 00:02
99 0.616283 1.031987 0.633333 00:02
100 0.611199 0.992564 0.666667 00:02
101 0.606229 0.990812 0.666667 00:02
102 0.601378 1.038975 0.633333 00:02
103 0.596639 0.925851 0.700000 00:02
104 0.592018 0.839252 0.766667 00:02
105 0.587503 0.846729 0.766667 00:02
106 0.583092 0.876285 0.766667 00:02
107 0.578791 0.867612 0.800000 00:02
108 0.574593 0.865421 0.833333 00:02
109 0.570497 0.861011 0.833333 00:02
110 0.566498 0.860369 0.800000 00:02
111 0.562593 0.856723 0.800000 00:02
112 0.558781 0.857777 0.833333 00:02
113 0.555063 0.861587 0.833333 00:02
114 0.551431 0.864382 0.833333 00:02
115 0.547886 0.866298 0.833333 00:02
116 0.544424 0.866298 0.833333 00:02
117 0.541043 0.876166 0.766667 00:02
118 0.537743 0.877081 0.766667 00:02
119 0.534521 0.870586 0.833333 00:02
120 0.531374 0.871706 0.833333 00:02
121 0.528300 0.868688 0.833333 00:02
122 0.525299 0.864780 0.833333 00:02
123 0.522367 0.863685 0.833333 00:02
124 0.519504 0.860523 0.866667 00:02
125 0.516706 0.858484 0.866667 00:02
126 0.513973 0.856415 0.866667 00:02
127 0.511304 0.855038 0.866667 00:02
128 0.508696 0.854263 0.866667 00:02
129 0.506147 0.853070 0.866667 00:02
130 0.503657 0.851796 0.833333 00:02
131 0.501223 0.852567 0.800000 00:02
132 0.498845 0.849896 0.800000 00:02
133 0.496521 0.847939 0.833333 00:02
134 0.494250 0.847190 0.833333 00:02
135 0.492030 0.846976 0.833333 00:02
136 0.489860 0.848106 0.833333 00:02
137 0.487739 0.849875 0.833333 00:02
138 0.485665 0.849752 0.833333 00:02
139 0.483638 0.850015 0.833333 00:02
140 0.481656 0.851885 0.833333 00:02
141 0.479719 0.855177 0.833333 00:02
142 0.477824 0.856939 0.833333 00:02
143 0.475971 0.856017 0.833333 00:02
144 0.474160 0.854898 0.833333 00:02
145 0.472389 0.854715 0.833333 00:02
146 0.470656 0.854882 0.833333 00:02
147 0.468962 0.854378 0.833333 00:02
148 0.467305 0.853540 0.833333 00:02
149 0.465685 0.853437 0.833333 00:02
150 0.464100 0.854188 0.833333 00:02
151 0.462550 0.855247 0.833333 00:02
152 0.461033 0.855636 0.833333 00:02
153 0.459550 0.855190 0.833333 00:02
154 0.458098 0.854677 0.833333 00:02
155 0.456679 0.854630 0.833333 00:02
156 0.455290 0.855154 0.833333 00:02
157 0.453931 0.855961 0.833333 00:02
158 0.452602 0.856542 0.833333 00:02
159 0.451301 0.856716 0.833333 00:02
160 0.450029 0.856649 0.833333 00:02
161 0.448783 0.856666 0.833333 00:02
162 0.447565 0.856919 0.833333 00:02
163 0.446372 0.857375 0.833333 00:02
164 0.445206 0.857843 0.833333 00:02
165 0.444064 0.858171 0.833333 00:02
166 0.442946 0.858309 0.833333 00:02
167 0.441853 0.858343 0.833333 00:02
168 0.440783 0.858372 0.833333 00:02
169 0.439735 0.858487 0.833333 00:02
170 0.438710 0.858700 0.833333 00:02
171 0.437707 0.858980 0.833333 00:02
172 0.436725 0.859262 0.833333 00:02
173 0.435763 0.859469 0.833333 00:02
174 0.434822 0.859581 0.833333 00:02
175 0.433901 0.859597 0.833333 00:02
176 0.433000 0.859544 0.833333 00:02
177 0.432117 0.859459 0.833333 00:02
178 0.431253 0.859385 0.833333 00:02
179 0.430408 0.859341 0.833333 00:02
180 0.429580 0.859343 0.833333 00:02
181 0.428769 0.859393 0.833333 00:02
182 0.427976 0.859490 0.833333 00:02
183 0.427199 0.859620 0.833333 00:02
184 0.426438 0.859767 0.833333 00:02
185 0.425694 0.859917 0.833333 00:02
186 0.424965 0.860061 0.833333 00:02
187 0.424251 0.860193 0.833333 00:02
188 0.423552 0.860311 0.833333 00:02
189 0.422868 0.860409 0.833333 00:02
190 0.422198 0.860496 0.833333 00:02
191 0.421542 0.860570 0.833333 00:02
192 0.420899 0.860630 0.833333 00:02
193 0.420270 0.860679 0.833333 00:02
194 0.419654 0.860717 0.833333 00:02
195 0.419051 0.860749 0.833333 00:02
196 0.418461 0.860777 0.833333 00:02
197 0.417882 0.860800 0.833333 00:02
198 0.417316 0.860822 0.833333 00:02
199 0.416761 0.860841 0.833333 00:02

83.3% is a pretty good result with the Beef dataset. But let's see if we can improve it even further by using data augmentation.

Applying data augmentation techniques

Some data augmentation techniques are applied to a single time series: the changes are made to that individual sample alone. One of these techniques is Cutout.

More recently, new data augmentations have appeared that combine a time series with another randomly selected time series, blending both in some way. 2 important techniques applicable to time series are Mixup and CutMix.

All these techniques work really well on images, but are still not often used with time series.

Data augmentation: Single Time Series

You'll see that applying these techniques is super easy. You only need to add the required callback.

Cutout (DeVries, 2017)

https://arxiv.org/abs/1708.04552

This is a single-item transformation, where a random section of a time series is replaced by zero.
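
To make the idea concrete, here's a minimal sketch of cutout on a single time series in plain PyTorch. This is just an illustration, not the fastai_timeseries implementation; in particular, drawing the window length from a Beta(alpha, alpha) distribution is an assumption about how alpha could control the amount of signal removed.

import torch

def cutout_1d(x, alpha=1.):
    # x: tensor of shape (channels, seq_len)
    seq_len = x.shape[-1]
    # window length as a fraction of the series (assumed to be Beta-distributed)
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    win = int(round(seq_len * lam))
    start = int(torch.randint(0, max(seq_len - win, 0) + 1, (1,)))
    out = x.clone()
    out[..., start:start + win] = 0   # replace the selected section by zero
    return out

# example on a random 1-channel series of length 470
x = torch.randn(1, 470)
x_aug = cutout_1d(x)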

You can apply all these techniques in 2 ways (the result is exactly the same):

learn = Learner(db, InceptionTime(db.features, db.c))
learn.cutout();
learn = Learner(db, InceptionTime(db.features, db.c)).cutout()

Since you cannot otherwise see the impact of the technique, I've built a function (show_tfms) to easily visualize it.

learn = Learner(db, InceptionTime(db.features, db.c)).cutout().show_tfms();

png

☣️ Remember that all these are examples of the same time series after cutout has been applied. All the techniques in this notebook are applied randomly on the fly, generating an endless number of variations.

Parameter

These techniques have a parameter that defines the amount of change from the original time series. It's called alpha.

For cutout, the default alpha is set to 1, but you can modify it up or down, depending on how much regularization you want to apply.
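
One way to build intuition for alpha: these augmentations typically draw the amount of change from a Beta(alpha, alpha) distribution, so higher alpha concentrates the draws around 0.5 (larger changes on average), while lower alpha pushes the draws towards 0 or 1 (often leaving the series almost untouched). A quick numpy check of that behaviour, independent of the library itself:

import numpy as np

rng = np.random.default_rng(0)
for alpha in (.2, 1., 2.):
    lam = rng.beta(alpha, alpha, size=10_000)
    near_half = (np.abs(lam - .5) < .25).mean()   # share of draws between 0.25 and 0.75
    print(f'alpha={alpha}: share of draws in [0.25, 0.75] = {near_half:.2f}')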

learn = Learner(db, InceptionTime(db.features, db.c)).cutout(alpha=1).show_tfms();

png

learn = Learner(db, InceptionTime(db.features, db.c)).cutout(alpha=.2).show_tfms();

png

learn = Learner(db, InceptionTime(db.features, db.c)).cutout(alpha=2.).show_tfms();

png

The default value is reasonable, but feel free to modify it.

Data augmentation: Multi Time Series

There are at least a couple of things multiTS data transforms have in common:

  • they combine 2 or more TS to create a new synthetic TS
  • unlike previous techniques like cutout, the entire TS provides informative datapoints.

Mixup (Zhang, 2018)

https://arxiv.org/abs/1710.09412

Mixup blends two time series randomly drawn from the training data. A weight λ (between .5 and 1) is assigned to the first sample, and 1-λ to the second one. Despite its simplicity, mixup achieves state-of-the-art performance on the CIFAR-10, CIFAR-100, and ImageNet-2012 image classification datasets, and can also improve performance in time series problems.
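
The core of mixup fits in a few lines. The sketch below is not the fastai callback itself, just the idea described above: λ is drawn from a Beta(alpha, alpha) distribution, clamped so the first sample always gets the larger weight, and both the inputs and the (one-hot) targets are blended.

import torch

def mixup_batch(x, y_onehot, alpha=.4):
    # x: (batch, channels, seq_len), y_onehot: (batch, n_classes)
    lam = torch.distributions.Beta(alpha, alpha).sample((x.size(0),))
    lam = torch.max(lam, 1 - lam)       # keep the weight of the first sample in [.5, 1]
    idx = torch.randperm(x.size(0))     # pair each sample with another random one
    lam_x = lam.view(-1, 1, 1)
    x_mix = lam_x * x + (1 - lam_x) * x[idx]
    lam_y = lam.view(-1, 1)
    y_mix = lam_y * y_onehot + (1 - lam_y) * y_onehot[idx]
    return x_mix, y_mix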

learn = Learner(db, InceptionTime(db.features, db.c)).mixup(alpha=.4).show_tfms();

png

Mixup creates time series that look very 'real', based on a weighted average of 2 time series.

The parameter for mixup is also called alpha, and its default value is set to .4. Usual values range between .2 and .4, although you can use any number greater than 0.

Cutmix (Yun, 2019)

https://arxiv.org/abs/1905.04899

Cutmix is similar to Cutout, but instead of zeroing out a section, a patch cut from another training time series is pasted in its place (in the original paper, the labels are also mixed in proportion to the patch size).

CutMix consistently outperforms the state-of-the-art augmentation strategies on CIFAR and ImageNet classification tasks, as well as on the ImageNet weakly-supervised localization task.
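
A sketch of the same idea adapted to time series follows. The 1D window (instead of the paper's 2D patch) and the label weight, set to the fraction of timesteps kept from the original series, are assumptions; the fastai_timeseries callback may differ in the details.

import torch

def cutmix_batch(x, y_onehot, alpha=1.):
    # x: (batch, channels, seq_len), y_onehot: (batch, n_classes)
    lam = torch.distributions.Beta(alpha, alpha).sample().item()
    idx = torch.randperm(x.size(0))
    seq_len = x.size(-1)
    win = int(round(seq_len * (1 - lam)))              # length of the pasted patch
    start = int(torch.randint(0, seq_len - win + 1, (1,)))
    x_mix = x.clone()
    x_mix[..., start:start + win] = x[idx, ..., start:start + win]
    kept = 1 - win / seq_len                           # fraction kept from the original series
    y_mix = kept * y_onehot + (1 - kept) * y_onehot[idx]
    return x_mix, y_mix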

learn = Learner(db, InceptionTime(db.features, db.c)).cutmix(alpha=1.).show_tfms();

png

For cutmix the default value of alpha is also 1.

How to train using data augmentation?

It's super easy! The only thing you need to do is:

  1. First you will create your databunch as you would normally do.
  2. Then you will create the learner as usual, adding to it the selected data augmentation (cutout, mixup or cutmix). You can only use one of these new data augmentations at a time.

Mixup

dsid = 'Beef'
db = create_UCR_databunch(dsid)
learn = Learner(db, InceptionTime(db.features, db.c), metrics=accuracy, 
                loss_func=LabelSmoothingCrossEntropy()).mixup()

  3. If you want to visualize the effect of data augmentation before training (to adjust alpha, for example), just add show_tfms():

learn.show_tfms();

png

That's it! You are now ready to train with data augmentation!

learn.fit_one_cycle(200)
epoch train_loss valid_loss accuracy time
0 1.689828 1.619471 0.200000 00:02
1 1.673922 1.618781 0.200000 00:02
2 1.647962 1.618125 0.200000 00:02
3 1.631207 1.617363 0.200000 00:02
4 1.611909 1.616646 0.200000 00:02
5 1.595307 1.615953 0.200000 00:02
6 1.578687 1.615269 0.200000 00:02
7 1.563238 1.614500 0.200000 00:02
8 1.548130 1.613729 0.200000 00:02
9 1.534638 1.612713 0.200000 00:02
10 1.522140 1.611550 0.200000 00:02
11 1.512481 1.610328 0.200000 00:02
12 1.499381 1.608953 0.200000 00:02
13 1.486657 1.607466 0.200000 00:02
14 1.474970 1.605166 0.200000 00:02
15 1.464807 1.602398 0.200000 00:02
16 1.450602 1.599006 0.200000 00:02
17 1.441010 1.592957 0.200000 00:02
18 1.431140 1.585209 0.200000 00:02
19 1.420655 1.575528 0.200000 00:02
20 1.411091 1.568540 0.233333 00:02
21 1.401187 1.557126 0.300000 00:02
22 1.393381 1.547969 0.300000 00:02
23 1.386891 1.509804 0.333333 00:02
24 1.380682 1.517822 0.266667 00:02
25 1.372284 1.507112 0.300000 00:02
26 1.363400 1.529526 0.233333 00:02
27 1.356471 1.546933 0.266667 00:02
28 1.345878 1.476422 0.366667 00:02
29 1.338359 1.489565 0.333333 00:02
30 1.328831 1.370115 0.366667 00:02
31 1.319710 1.492780 0.333333 00:02
32 1.310245 1.909312 0.200000 00:02
33 1.304384 1.942166 0.200000 00:02
34 1.298426 1.449292 0.466667 00:02
35 1.296161 1.392294 0.566667 00:02
36 1.287528 1.707830 0.433333 00:02
37 1.279770 1.918631 0.300000 00:02
38 1.270381 1.791980 0.366667 00:02
39 1.259860 3.575068 0.333333 00:02
40 1.254035 2.539945 0.400000 00:02
41 1.245580 1.968613 0.466667 00:02
42 1.237224 2.108842 0.333333 00:02
43 1.228153 2.946331 0.333333 00:02
44 1.220387 3.205979 0.433333 00:02
45 1.212358 3.939463 0.266667 00:02
46 1.208177 5.770495 0.200000 00:02
47 1.201450 3.143615 0.466667 00:02
48 1.195719 2.494654 0.333333 00:02
49 1.189962 4.610262 0.366667 00:02
50 1.184943 3.094051 0.433333 00:02
51 1.178857 4.785711 0.366667 00:02
52 1.173128 5.147786 0.333333 00:02
53 1.166027 4.195990 0.333333 00:02
54 1.158504 4.618266 0.366667 00:02
55 1.154156 5.224677 0.400000 00:02
56 1.146860 5.487449 0.433333 00:02
57 1.143896 3.752536 0.366667 00:02
58 1.139026 3.471848 0.366667 00:02
59 1.132638 3.616133 0.366667 00:02
60 1.127449 3.148611 0.333333 00:02
61 1.119508 2.473933 0.333333 00:02
62 1.111254 2.150029 0.266667 00:02
63 1.103141 1.433178 0.500000 00:02
64 1.095411 1.055387 0.633333 00:02
65 1.088881 1.327046 0.600000 00:02
66 1.081982 1.605395 0.566667 00:02
67 1.079995 2.084538 0.433333 00:02
68 1.077057 2.170066 0.400000 00:02
69 1.073893 2.078325 0.566667 00:02
70 1.070387 5.268016 0.266667 00:02
71 1.065864 10.335102 0.200000 00:02
72 1.064703 4.640256 0.266667 00:02
73 1.061717 1.847973 0.366667 00:02
74 1.060507 1.547381 0.533333 00:02
75 1.057674 1.439578 0.666667 00:02
76 1.055321 1.494718 0.666667 00:02
77 1.054715 2.734326 0.233333 00:02
78 1.049378 1.831880 0.533333 00:02
79 1.048055 1.342813 0.600000 00:02
80 1.045389 1.058503 0.600000 00:02
81 1.038875 2.418780 0.466667 00:02
82 1.036652 3.206536 0.500000 00:02
83 1.034173 2.932166 0.466667 00:02
84 1.030047 2.636901 0.400000 00:02
85 1.027156 2.956616 0.333333 00:02
86 1.020752 3.017718 0.366667 00:02
87 1.016966 2.859704 0.366667 00:02
88 1.011848 2.513342 0.366667 00:02
89 1.007957 2.365700 0.366667 00:02
90 1.003033 2.211102 0.400000 00:02
91 1.000160 2.693544 0.200000 00:02
92 0.995621 2.895486 0.233333 00:02
93 0.995153 2.610945 0.333333 00:02
94 0.991598 2.286873 0.200000 00:02
95 0.988696 2.728762 0.200000 00:02
96 0.987154 2.344796 0.300000 00:02
97 0.983275 3.232719 0.300000 00:02
98 0.980970 1.335799 0.500000 00:02
99 0.977147 1.167713 0.533333 00:02
100 0.972813 1.170871 0.700000 00:02
101 0.971278 1.362768 0.500000 00:02
102 0.969613 1.461215 0.600000 00:02
103 0.965762 1.476665 0.533333 00:02
104 0.960326 2.355848 0.533333 00:02
105 0.955352 2.981581 0.433333 00:02
106 0.949955 3.107812 0.433333 00:02
107 0.946355 2.925362 0.400000 00:02
108 0.940663 2.415063 0.433333 00:02
109 0.938540 1.688057 0.600000 00:02
110 0.937894 1.362045 0.566667 00:02
111 0.933151 1.243008 0.600000 00:02
112 0.932395 1.176888 0.566667 00:02
113 0.928695 1.349458 0.533333 00:02
114 0.926381 1.627350 0.400000 00:02
115 0.921778 1.600521 0.400000 00:02
116 0.917558 1.366820 0.533333 00:02
117 0.912869 1.430818 0.533333 00:02
118 0.909506 1.488323 0.466667 00:02
119 0.908174 1.238361 0.533333 00:02
120 0.906753 0.972730 0.633333 00:02
121 0.902651 0.901888 0.733333 00:02
122 0.899289 0.816972 0.766667 00:02
123 0.894561 0.944673 0.700000 00:02
124 0.892816 0.783612 0.833333 00:02
125 0.887467 0.708474 0.900000 00:02
126 0.885093 0.726909 0.866667 00:02
127 0.881646 0.995866 0.700000 00:02
128 0.875917 1.191868 0.600000 00:02
129 0.871996 1.267549 0.566667 00:02
130 0.869246 1.369829 0.466667 00:02
131 0.867821 1.436767 0.466667 00:02
132 0.864062 1.160231 0.566667 00:02
133 0.859628 1.105737 0.666667 00:02
134 0.859078 0.851334 0.800000 00:02
135 0.858720 0.778529 0.766667 00:02
136 0.856945 1.147635 0.700000 00:02
137 0.854129 1.129824 0.700000 00:02
138 0.852220 0.979698 0.766667 00:02
139 0.847723 0.809028 0.800000 00:02
140 0.843133 0.673137 0.933333 00:02
141 0.839500 0.736410 0.800000 00:02
142 0.835289 0.999134 0.666667 00:02
143 0.833026 1.256068 0.600000 00:02
144 0.833735 1.237204 0.633333 00:02
145 0.834965 1.282824 0.600000 00:02
146 0.833917 1.396549 0.566667 00:02
147 0.828235 1.317177 0.566667 00:02
148 0.827091 1.111997 0.666667 00:02
149 0.824293 0.851436 0.733333 00:02
150 0.823835 0.715306 0.866667 00:02
151 0.822530 0.689193 0.833333 00:02
152 0.817367 0.732036 0.800000 00:02
153 0.817202 0.856252 0.733333 00:02
154 0.816193 0.948537 0.666667 00:02
155 0.813613 1.066483 0.633333 00:02
156 0.811144 1.068854 0.666667 00:02
157 0.810831 1.071055 0.666667 00:02
158 0.808077 0.977296 0.666667 00:02
159 0.804630 0.838821 0.733333 00:02
160 0.802261 0.772696 0.833333 00:02
161 0.802727 0.791702 0.833333 00:02
162 0.798813 0.835375 0.833333 00:02
163 0.797573 0.865562 0.733333 00:02
164 0.798201 0.891660 0.733333 00:02
165 0.795692 0.892899 0.733333 00:02
166 0.793362 0.867864 0.733333 00:02
167 0.792883 0.810626 0.766667 00:02
168 0.791444 0.769112 0.766667 00:02
169 0.788628 0.742704 0.833333 00:02
170 0.787844 0.720856 0.833333 00:02
171 0.784512 0.715271 0.833333 00:02
172 0.782812 0.724146 0.833333 00:02
173 0.780747 0.742088 0.833333 00:02
174 0.779653 0.765256 0.800000 00:02
175 0.777027 0.789565 0.800000 00:02
176 0.777006 0.810619 0.766667 00:02
177 0.777498 0.812250 0.800000 00:02
178 0.774280 0.812091 0.800000 00:02
179 0.774260 0.810058 0.800000 00:02
180 0.773745 0.804928 0.800000 00:02
181 0.771602 0.797569 0.800000 00:02
182 0.770738 0.778828 0.800000 00:02
183 0.769796 0.761000 0.833333 00:02
184 0.769606 0.746166 0.833333 00:02
185 0.769999 0.735135 0.866667 00:02
186 0.769042 0.724371 0.866667 00:02
187 0.765944 0.714545 0.833333 00:02
188 0.766425 0.709452 0.833333 00:02
189 0.765466 0.703008 0.833333 00:02
190 0.761368 0.699016 0.866667 00:02
191 0.759790 0.696969 0.866667 00:02
192 0.758059 0.694304 0.866667 00:02
193 0.756838 0.691365 0.866667 00:02
194 0.756011 0.691939 0.866667 00:02
195 0.755389 0.692854 0.866667 00:02
196 0.754549 0.691379 0.866667 00:02
197 0.754667 0.690486 0.866667 00:02
198 0.754686 0.692739 0.866667 00:02
199 0.753276 0.694158 0.866667 00:02

☣️ I've built a chart to compare performance with and without mixup. As I've seen in many experiments, it takes longer to reach a high level of performance with mixup, but accuracy then tends to keep improving for longer than without it. This is something to take into account when designing your experiments.
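
The chart itself isn't reproduced here, but you can build a similar comparison from each learner's recorder. The sketch below assumes you kept both trained learners around (learn_plain and learn_mixup are hypothetical names) and that, as in fastai v1, learn.recorder.metrics holds one list of metric values per epoch.

import matplotlib.pyplot as plt

def epoch_accuracy(learn):
    # accuracy is the only metric passed to the Learner, so it's the first entry
    return [float(m[0]) for m in learn.recorder.metrics]

plt.plot(epoch_accuracy(learn_plain), label='no augmentation')
plt.plot(epoch_accuracy(learn_mixup), label='mixup')
plt.xlabel('epoch')
plt.ylabel('validation accuracy')
plt.legend()
plt.show()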

Cutmix

learn = Learner(db, InceptionTime(db.features, db.c), metrics=accuracy, 
                loss_func=LabelSmoothingCrossEntropy()).cutmix()
learn.fit_one_cycle(200)
epoch train_loss valid_loss accuracy time
0 1.706453 1.624805 0.200000 00:02
1 1.685710 1.624133 0.200000 00:02
2 1.661669 1.623917 0.200000 00:02
3 1.638445 1.623724 0.200000 00:02
4 1.624182 1.623581 0.200000 00:02
5 1.614200 1.623360 0.200000 00:02
6 1.604189 1.623062 0.200000 00:02
7 1.591671 1.622744 0.200000 00:02
8 1.579674 1.622297 0.200000 00:02
9 1.569131 1.621474 0.200000 00:02
10 1.561477 1.620105 0.200000 00:02
11 1.550535 1.618287 0.200000 00:02
12 1.537226 1.616055 0.200000 00:02
13 1.533442 1.613569 0.200000 00:02
14 1.529642 1.610154 0.200000 00:02
15 1.526031 1.606566 0.200000 00:02
16 1.516714 1.601131 0.200000 00:02
17 1.509571 1.593389 0.200000 00:02
18 1.506086 1.585316 0.333333 00:02
19 1.502734 1.573562 0.233333 00:02
20 1.499156 1.554533 0.466667 00:02
21 1.488737 1.531936 0.333333 00:02
22 1.484465 1.516183 0.333333 00:02
23 1.481679 1.487968 0.400000 00:02
24 1.480139 1.481755 0.266667 00:02
25 1.474577 1.457397 0.333333 00:02
26 1.471540 1.453811 0.333333 00:02
27 1.470596 1.458764 0.333333 00:02
28 1.469030 1.463857 0.333333 00:02
29 1.466600 1.523658 0.333333 00:02
30 1.464954 1.616127 0.333333 00:02
31 1.463124 1.639686 0.300000 00:02
32 1.458457 1.663750 0.300000 00:02
33 1.452542 1.680489 0.366667 00:02
34 1.448779 1.987832 0.233333 00:02
35 1.443480 1.724338 0.333333 00:02
36 1.439493 2.210143 0.233333 00:02
37 1.436260 2.561839 0.300000 00:02
38 1.435892 1.457669 0.400000 00:02
39 1.433157 1.703312 0.333333 00:02
40 1.431497 2.224642 0.366667 00:02
41 1.430971 2.875991 0.400000 00:02
42 1.428263 2.325544 0.333333 00:02
43 1.427699 1.876932 0.300000 00:02
44 1.422279 1.727904 0.333333 00:02
45 1.418207 1.762134 0.300000 00:02
46 1.416281 1.538334 0.466667 00:02
47 1.415849 1.947046 0.433333 00:02
48 1.412131 1.897876 0.433333 00:02
49 1.409524 2.145555 0.400000 00:02
50 1.404963 6.031844 0.333333 00:02
51 1.396030 6.310513 0.333333 00:02
52 1.396604 5.209888 0.333333 00:02
53 1.394261 4.222475 0.333333 00:02
54 1.391686 2.320999 0.400000 00:02
55 1.388773 1.465874 0.433333 00:02
56 1.386416 2.058067 0.366667 00:02
57 1.386596 3.500876 0.333333 00:02
58 1.384054 2.610673 0.366667 00:02
59 1.384070 2.270039 0.366667 00:02
60 1.383814 1.708934 0.466667 00:02
61 1.380550 1.421436 0.366667 00:02
62 1.382743 1.412282 0.433333 00:02
63 1.384651 2.413202 0.333333 00:02
64 1.380379 2.628878 0.333333 00:02
65 1.380742 2.428576 0.333333 00:02
66 1.378306 1.427382 0.433333 00:02
67 1.377688 1.407880 0.400000 00:02
68 1.377784 1.648727 0.366667 00:02
69 1.374915 1.851315 0.366667 00:02
70 1.372057 1.959223 0.333333 00:02
71 1.367003 1.710080 0.400000 00:02
72 1.366259 1.444495 0.333333 00:02
73 1.366753 1.294768 0.500000 00:02
74 1.362236 1.499587 0.366667 00:02
75 1.359724 2.356512 0.433333 00:02
76 1.359129 2.694739 0.400000 00:02
77 1.358984 2.540057 0.366667 00:02
78 1.357662 2.659967 0.333333 00:02
79 1.360360 2.046504 0.333333 00:02
80 1.356875 1.603783 0.366667 00:02
81 1.356482 1.273809 0.366667 00:02
82 1.354857 1.436357 0.466667 00:02
83 1.352965 1.385521 0.500000 00:02
84 1.348206 1.277132 0.566667 00:02
85 1.347547 1.401034 0.366667 00:02
86 1.346372 1.559262 0.400000 00:02
87 1.344510 1.635159 0.466667 00:02
88 1.344424 1.650026 0.466667 00:02
89 1.342335 1.612467 0.466667 00:02
90 1.341733 1.306708 0.500000 00:02
91 1.338287 1.169277 0.533333 00:02
92 1.337811 1.212823 0.600000 00:02
93 1.337693 1.435008 0.466667 00:02
94 1.331457 1.337413 0.400000 00:02
95 1.331445 1.105399 0.666667 00:02
96 1.331121 1.171597 0.533333 00:02
97 1.324570 1.264037 0.500000 00:02
98 1.322947 1.286684 0.400000 00:02
99 1.323301 1.347077 0.400000 00:02
100 1.321393 1.297375 0.366667 00:02
101 1.319361 1.191885 0.533333 00:02
102 1.315395 1.189938 0.600000 00:02
103 1.315025 1.259681 0.466667 00:02
104 1.308935 1.349674 0.400000 00:02
105 1.305654 1.378004 0.466667 00:02
106 1.305785 1.346191 0.433333 00:02
107 1.305715 1.356849 0.433333 00:02
108 1.304183 1.541104 0.400000 00:02
109 1.302752 1.282268 0.500000 00:02
110 1.303141 1.279333 0.500000 00:02
111 1.300831 1.186200 0.666667 00:02
112 1.300369 1.150524 0.666667 00:02
113 1.304650 1.354581 0.466667 00:02
114 1.303918 1.443647 0.433333 00:02
115 1.302446 1.253269 0.600000 00:02
116 1.299625 1.254723 0.533333 00:02
117 1.297959 1.355177 0.466667 00:02
118 1.295360 1.413822 0.500000 00:02
119 1.294235 1.272771 0.500000 00:02
120 1.294245 1.135344 0.600000 00:02
121 1.290968 1.179739 0.500000 00:02
122 1.291180 1.224161 0.533333 00:02
123 1.288947 1.178296 0.533333 00:02
124 1.286780 1.094188 0.566667 00:02
125 1.278990 1.079829 0.633333 00:02
126 1.280690 1.146522 0.633333 00:02
127 1.277659 1.112404 0.700000 00:02
128 1.275256 1.077389 0.566667 00:02
129 1.274448 1.371475 0.500000 00:02
130 1.273802 1.661546 0.533333 00:02
131 1.275651 1.745858 0.466667 00:02
132 1.273929 1.663937 0.500000 00:02
133 1.270650 1.477462 0.566667 00:02
134 1.269193 1.260925 0.566667 00:02
135 1.267992 1.182368 0.633333 00:02
136 1.265650 1.268212 0.633333 00:02
137 1.263543 1.349007 0.600000 00:02
138 1.262578 1.366768 0.533333 00:02
139 1.260029 1.291980 0.600000 00:02
140 1.258410 1.213087 0.666667 00:02
141 1.260078 1.172351 0.633333 00:02
142 1.258358 1.161846 0.666667 00:02
143 1.252187 1.153422 0.666667 00:02
144 1.251021 1.143518 0.666667 00:02
145 1.251198 1.133091 0.666667 00:02
146 1.249392 1.119584 0.666667 00:02
147 1.248203 1.107645 0.666667 00:02
148 1.247276 1.119978 0.633333 00:02
149 1.243595 1.146225 0.633333 00:02
150 1.241543 1.176757 0.633333 00:02
151 1.238316 1.200777 0.633333 00:02
152 1.237871 1.213209 0.633333 00:02
153 1.235683 1.203346 0.666667 00:02
154 1.233224 1.185343 0.666667 00:02
155 1.232618 1.150263 0.666667 00:02
156 1.233111 1.124269 0.700000 00:02
157 1.232996 1.082658 0.700000 00:02
158 1.231997 1.048401 0.733333 00:02
159 1.232146 1.027407 0.700000 00:02
160 1.232651 1.024449 0.666667 00:02
161 1.228856 1.025052 0.666667 00:02
162 1.230592 1.037613 0.633333 00:02
163 1.230132 1.037835 0.600000 00:02
164 1.226062 1.028810 0.633333 00:02
165 1.220277 1.017511 0.633333 00:02
166 1.219127 1.005891 0.666667 00:02
167 1.217834 0.992963 0.700000 00:02
168 1.215868 0.988810 0.700000 00:02
169 1.210998 0.987222 0.733333 00:02
170 1.205445 0.989531 0.733333 00:02
171 1.202443 0.994882 0.666667 00:02
172 1.200605 0.996973 0.666667 00:02
173 1.199470 0.996149 0.666667 00:02
174 1.196583 0.994541 0.666667 00:02
175 1.195204 0.992158 0.666667 00:02
176 1.194456 0.990747 0.666667 00:02
177 1.192639 0.986806 0.666667 00:02
178 1.189554 0.983644 0.666667 00:02
179 1.188267 0.980869 0.666667 00:02
180 1.185815 0.978657 0.666667 00:02
181 1.185371 0.974919 0.666667 00:02
182 1.183739 0.973566 0.700000 00:02
183 1.179749 0.973964 0.733333 00:02
184 1.177038 0.974452 0.733333 00:02
185 1.167234 0.976603 0.733333 00:02
186 1.166255 0.980606 0.766667 00:02
187 1.164219 0.982645 0.766667 00:02
188 1.161877 0.983812 0.766667 00:02
189 1.160285 0.987663 0.733333 00:02
190 1.158639 0.989382 0.766667 00:02
191 1.157610 0.992286 0.733333 00:02
192 1.156571 0.991146 0.766667 00:02
193 1.157177 0.990459 0.766667 00:02
194 1.159675 0.990053 0.800000 00:02
195 1.161296 0.990250 0.800000 00:02
196 1.160810 0.989652 0.800000 00:02
197 1.161151 0.988033 0.800000 00:02
198 1.160224 0.986137 0.800000 00:02
199 1.159139 0.985020 0.800000 00:02

Cutout

learn = Learner(db, InceptionTime(db.features, db.c), metrics=accuracy, 
                loss_func=LabelSmoothingCrossEntropy()).cutout()
learn.fit_one_cycle(200)
epoch train_loss valid_loss accuracy time
0 1.660396 1.620012 0.200000 00:02
1 1.637469 1.619628 0.200000 00:02
2 1.631980 1.619524 0.200000 00:02
3 1.618567 1.619128 0.200000 00:02
4 1.604060 1.618873 0.200000 00:02
5 1.591112 1.618706 0.200000 00:02
6 1.577234 1.618439 0.200000 00:02
7 1.566691 1.618009 0.200000 00:02
8 1.574861 1.617688 0.200000 00:02
9 1.565786 1.617103 0.200000 00:02
10 1.553699 1.616126 0.200000 00:02
11 1.540023 1.614528 0.200000 00:02
12 1.527577 1.612250 0.200000 00:02
13 1.518896 1.608595 0.200000 00:02
14 1.507092 1.604372 0.200000 00:02
15 1.501131 1.597838 0.200000 00:02
16 1.504705 1.588038 0.200000 00:02
17 1.493358 1.578237 0.266667 00:02
18 1.483621 1.567515 0.266667 00:02
19 1.474008 1.555968 0.300000 00:02
20 1.465212 1.544741 0.300000 00:02
21 1.454736 1.529409 0.333333 00:02
22 1.443652 1.509047 0.333333 00:02
23 1.434206 1.488219 0.333333 00:02
24 1.424903 1.487186 0.333333 00:02
25 1.414110 1.454912 0.366667 00:02
26 1.403112 1.411377 0.366667 00:02
27 1.400361 1.454942 0.300000 00:02
28 1.394396 2.853700 0.200000 00:02
29 1.386278 3.738343 0.200000 00:02
30 1.380608 3.026152 0.200000 00:02
31 1.375208 2.086259 0.300000 00:02
32 1.372596 2.029034 0.300000 00:02
33 1.368522 1.776222 0.300000 00:02
34 1.364299 1.564647 0.400000 00:02
35 1.356729 1.687982 0.400000 00:02
36 1.349027 1.828559 0.366667 00:02
37 1.341715 1.832533 0.333333 00:02
38 1.331327 1.929781 0.366667 00:02
39 1.324214 1.609313 0.400000 00:02
40 1.316651 3.501971 0.200000 00:02
41 1.309119 4.311689 0.200000 00:02
42 1.313026 13.606978 0.200000 00:02
43 1.311430 9.567814 0.200000 00:02
44 1.308879 7.534220 0.200000 00:02
45 1.306159 3.707384 0.233333 00:02
46 1.301580 2.191738 0.366667 00:02
47 1.295914 2.324064 0.366667 00:02
48 1.307469 9.194708 0.200000 00:02
49 1.302346 4.097297 0.333333 00:02
50 1.296714 3.400907 0.400000 00:02
51 1.291286 4.134081 0.366667 00:02
52 1.292245 9.854806 0.300000 00:02
53 1.291874 32.586937 0.200000 00:02
54 1.288160 29.200527 0.200000 00:02
55 1.284828 16.745173 0.200000 00:02
56 1.283797 11.435283 0.200000 00:02
57 1.277688 9.402293 0.200000 00:02
58 1.272753 7.304148 0.200000 00:02
59 1.266980 5.996444 0.200000 00:02
60 1.264276 6.278485 0.200000 00:02
61 1.258912 11.868817 0.200000 00:02
62 1.253075 9.906275 0.200000 00:02
63 1.252298 8.497372 0.200000 00:02
64 1.248089 8.165295 0.200000 00:02
65 1.244134 5.703846 0.200000 00:02
66 1.238663 3.617159 0.200000 00:02
67 1.232986 2.547888 0.300000 00:02
68 1.228218 2.249779 0.366667 00:02
69 1.234005 1.480027 0.400000 00:02
70 1.244705 5.166591 0.200000 00:02
71 1.243025 9.313371 0.200000 00:02
72 1.242977 9.378866 0.200000 00:02
73 1.242423 7.334600 0.200000 00:02
74 1.243052 6.318696 0.200000 00:02
75 1.247443 5.345894 0.200000 00:02
76 1.245416 4.845418 0.200000 00:02
77 1.244565 5.320508 0.200000 00:02
78 1.241501 3.297099 0.366667 00:02
79 1.236092 3.492207 0.333333 00:02
80 1.233410 3.885850 0.266667 00:02
81 1.227951 3.243068 0.300000 00:02
82 1.225131 1.910741 0.433333 00:02
83 1.223312 1.487790 0.466667 00:02
84 1.216726 1.447462 0.433333 00:02
85 1.220145 1.574987 0.433333 00:02
86 1.218359 2.194073 0.366667 00:02
87 1.215006 2.299147 0.366667 00:02
88 1.210736 2.274392 0.333333 00:02
89 1.207281 2.524296 0.266667 00:02
90 1.205115 2.238126 0.300000 00:02
91 1.204487 2.013067 0.433333 00:02
92 1.199850 2.660136 0.333333 00:02
93 1.196869 3.244203 0.333333 00:02
94 1.194110 3.232043 0.333333 00:02
95 1.192502 2.691142 0.333333 00:02
96 1.195390 2.010396 0.333333 00:02
97 1.190254 1.854620 0.366667 00:02
98 1.184024 1.996308 0.366667 00:02
99 1.177545 1.911293 0.366667 00:02
100 1.174493 1.540360 0.433333 00:02
101 1.167482 1.370175 0.400000 00:02
102 1.159704 1.308586 0.400000 00:02
103 1.153548 1.548130 0.366667 00:02
104 1.146871 1.348858 0.366667 00:02
105 1.146403 1.077422 0.700000 00:02
106 1.141683 1.196881 0.500000 00:02
107 1.135765 1.291074 0.500000 00:02
108 1.129827 1.262042 0.466667 00:02
109 1.123418 1.167186 0.700000 00:02
110 1.116877 1.232698 0.566667 00:02
111 1.113724 1.866791 0.433333 00:02
112 1.111017 2.025032 0.366667 00:02
113 1.109310 1.887846 0.366667 00:02
114 1.102472 1.619525 0.400000 00:02
115 1.095449 1.380898 0.500000 00:02
116 1.090489 1.227257 0.533333 00:02
117 1.085784 1.403344 0.533333 00:02
118 1.078969 1.733581 0.533333 00:02
119 1.074235 1.987759 0.533333 00:02
120 1.083529 1.523327 0.533333 00:02
121 1.083109 1.330128 0.466667 00:02
122 1.078421 1.807096 0.333333 00:02
123 1.077411 2.611065 0.266667 00:02
124 1.076933 3.256675 0.266667 00:02
125 1.078660 3.346977 0.266667 00:02
126 1.073926 2.891910 0.266667 00:02
127 1.072688 2.565595 0.333333 00:02
128 1.067860 2.095694 0.366667 00:02
129 1.068164 1.778149 0.433333 00:02
130 1.066739 1.461159 0.533333 00:02
131 1.063635 1.250543 0.666667 00:02
132 1.061969 1.226142 0.666667 00:02
133 1.057962 1.190449 0.666667 00:02
134 1.051249 1.223222 0.666667 00:02
135 1.046239 1.219140 0.666667 00:02
136 1.043587 1.158116 0.666667 00:02
137 1.037572 1.107074 0.733333 00:02
138 1.034447 1.136540 0.666667 00:02
139 1.027728 1.185949 0.633333 00:02
140 1.027412 1.308161 0.500000 00:02
141 1.021058 1.282450 0.500000 00:02
142 1.017411 1.306550 0.500000 00:02
143 1.016686 1.263149 0.500000 00:02
144 1.010793 1.198263 0.566667 00:02
145 1.008476 1.317031 0.533333 00:02
146 1.005065 1.481030 0.466667 00:02
147 0.999182 1.471093 0.466667 00:02
148 0.997035 1.308852 0.566667 00:02
149 0.994885 1.158710 0.600000 00:02
150 0.988591 1.089001 0.700000 00:02
151 0.983139 1.057717 0.700000 00:02
152 0.980064 1.021482 0.766667 00:02
153 0.976560 1.005424 0.766667 00:02
154 0.974256 1.056330 0.666667 00:02
155 0.970273 1.198120 0.566667 00:02
156 0.965092 1.395134 0.500000 00:02
157 0.962309 1.325926 0.533333 00:02
158 0.958749 1.232951 0.600000 00:02
159 0.953150 1.236645 0.566667 00:02
160 0.948714 1.141429 0.633333 00:02
161 0.943281 1.050981 0.700000 00:02
162 0.942111 1.027893 0.733333 00:02
163 0.935896 1.021700 0.700000 00:02
164 0.934182 1.058105 0.666667 00:02
165 0.931000 1.137445 0.733333 00:02
166 0.934147 1.215132 0.600000 00:02
167 0.930111 1.241952 0.600000 00:02
168 0.926065 1.256834 0.633333 00:02
169 0.920515 1.266970 0.633333 00:02
170 0.916091 1.295894 0.633333 00:02
171 0.916563 1.379379 0.533333 00:02
172 0.913004 1.357460 0.600000 00:02
173 0.909355 1.322893 0.633333 00:02
174 0.907756 1.289818 0.633333 00:02
175 0.905712 1.310621 0.666667 00:02
176 0.906727 1.289430 0.633333 00:02
177 0.906684 1.339022 0.633333 00:02
178 0.906677 1.321404 0.633333 00:02
179 0.903140 1.323079 0.633333 00:02
180 0.899265 1.351478 0.633333 00:02
181 0.895234 1.308878 0.666667 00:02
182 0.894889 1.251764 0.700000 00:02
183 0.893129 1.269285 0.700000 00:02
184 0.887717 1.244808 0.700000 00:02
185 0.884424 1.245035 0.700000 00:02
186 0.888531 1.224573 0.666667 00:02
187 0.885721 1.242810 0.600000 00:02
188 0.883915 1.238724 0.633333 00:02
189 0.880226 1.209234 0.666667 00:02
190 0.877577 1.236441 0.733333 00:02
191 0.874377 1.232569 0.733333 00:02
192 0.871007 1.249061 0.633333 00:02
193 0.873851 1.186095 0.700000 00:02
194 0.869425 1.165819 0.666667 00:02
195 0.865002 1.161543 0.666667 00:02
196 0.863245 1.160168 0.666667 00:02
197 0.861162 1.164579 0.700000 00:02
198 0.859056 1.186687 0.700000 00:02
199 0.862104 1.236917 0.700000 00:02

Scheduled data transformation

As a bonus, for those of you who enjoy more complex approaches, I've created a function that will allow you to automatically adjust the value of the alpha parameter during training. Let's see how it works.

☣️ Please bear in mind that alpha for mixup must be greater than 0 (this is a current fastai constraint). So if you want to start from 0, you can use something small like .001 instead.
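
In the example below, alpha is annealed from .001 to 1. over the first 70% of training using fastai's annealing_cos. As a rough sketch of what that scheduling function computes (the actual fastai source may differ slightly):

import numpy as np

def annealing_cos(start, end, pct):
    # cosine-anneal from `start` to `end` as `pct` goes from 0 to 1
    cos_out = np.cos(np.pi * pct) + 1      # goes from 2 down to 0
    return end + (start - end) / 2 * cos_out

# alpha at the start, middle and end of the scheduled phase (first 70% of training)
for pct in (0., .5, 1.):
    print(annealing_cos(.001, 1., pct))    # roughly 0.001, 0.5, 1.0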

learn = Learner(db, InceptionTime(db.features, db.c), metrics=accuracy,
               loss_func=LabelSmoothingCrossEntropy())

tfm_fn = mixup
sch_param='alpha'
sch_val = (.001, 1.)              # values of parameter alpha (initial, final)
sch_iter = (0., .7)               # percent of training epochs (start, end)
sch_func = partial(annealing_cos) # annealing_cos, None = annealing_linear, cosine_annealing
plot = True
test = True                       # set to True for adjusting the values. When ready to train set to False
sch_tfm_cb = partial(TfmScheduler, tfm_fn=tfm_fn, sch_param=sch_param, sch_val=sch_val, 
                      sch_iter=sch_iter, sch_func=sch_func, plot=plot, test=test)
learn.callback_fns.append(sch_tfm_cb)

learn.fit_one_cycle(200)
 alpha between 0.001 and 1.0 in iters 0.00 to 0.70

png


Interrupted

In test mode, the run only plots the schedule and then stops before any training. Once the schedule looks right, set test=False and train:

learn = Learner(db, InceptionTime(db.features, db.c), metrics=accuracy,
               loss_func=LabelSmoothingCrossEntropy())

tfm_fn = mixup
sch_param='alpha'
sch_val = (.001, 1.)              # values of parameter alpha (initial, final)
sch_iter = (0., .7)               # percent of training epochs (start, end)
sch_func = partial(annealing_cos) # annealing_cos, None = annealing_linear, cosine_annealing
plot = True
test = False                      # set to True for adjusting the values. When ready to train set to False
sch_tfm_cb = partial(TfmScheduler, tfm_fn=tfm_fn, sch_param=sch_param, sch_val=sch_val, 
                      sch_iter=sch_iter, sch_func=sch_func, plot=plot, test=test)
learn.callback_fns.append(sch_tfm_cb)

learn.fit_one_cycle(200)
alpha between 0.001 and 1.0 in iters 0.00 to 0.70

png
epoch train_loss valid_loss accuracy time
0 1.626387 1.627592 0.100000 00:02
1 1.601632 1.626852 0.166667 00:02
2 1.580567 1.626486 0.200000 00:02
3 1.561646 1.626253 0.200000 00:02
4 1.543889 1.625805 0.200000 00:02
5 1.526808 1.625060 0.200000 00:02
6 1.510128 1.624043 0.200000 00:02
7 1.493622 1.622928 0.200000 00:02
8 1.477247 1.621706 0.200000 00:02
9 1.461012 1.620356 0.200000 00:02
10 1.444964 1.618782 0.200000 00:02
11 1.429263 1.616995 0.200000 00:02
12 1.415805 1.614870 0.200000 00:02
13 1.401853 1.613464 0.200000 00:02
14 1.388338 1.609434 0.200000 00:02
15 1.374339 1.610430 0.200000 00:02
16 1.362183 1.602663 0.200000 00:02
17 1.348864 1.601656 0.200000 00:02
18 1.335258 1.606995 0.200000 00:02
19 1.321574 1.596013 0.200000 00:02
20 1.306098 1.592437 0.200000 00:02
21 1.293900 1.581123 0.200000 00:02
22 1.279845 1.564399 0.233333 00:02
23 1.271103 1.500022 0.266667 00:02
24 1.261507 1.535057 0.200000 00:02
25 1.248613 1.472087 0.366667 00:02
26 1.237349 1.417059 0.433333 00:02
27 1.225417 1.435310 0.400000 00:02
28 1.214563 1.365397 0.433333 00:02
29 1.203867 1.452246 0.366667 00:02
30 1.194183 1.263637 0.433333 00:02
31 1.185022 1.283553 0.366667 00:02
32 1.171834 1.517067 0.366667 00:02
33 1.163586 1.597277 0.400000 00:02
34 1.149949 1.694383 0.366667 00:02
35 1.138015 1.358102 0.533333 00:02
36 1.124267 2.482912 0.333333 00:02
37 1.120735 1.297349 0.533333 00:02
38 1.109981 2.298659 0.433333 00:02
39 1.103910 4.613716 0.400000 00:02
40 1.101009 9.654345 0.333333 00:02
41 1.094292 5.961040 0.266667 00:02
42 1.091424 4.990852 0.333333 00:02
43 1.085680 6.741884 0.333333 00:02
44 1.078918 1.311232 0.466667 00:02
45 1.077116 3.062059 0.366667 00:02
46 1.070217 1.283510 0.366667 00:02
47 1.064868 3.715569 0.366667 00:02
48 1.059177 1.724050 0.400000 00:02
49 1.054424 4.899724 0.333333 00:02
50 1.054437 2.283245 0.400000 00:02
51 1.053111 2.297719 0.466667 00:02
52 1.050523 2.093637 0.433333 00:02
53 1.047141 1.727470 0.466667 00:02
54 1.046703 4.376657 0.333333 00:02
55 1.043961 6.196053 0.366667 00:02
56 1.039703 6.152739 0.266667 00:02
57 1.035884 8.199553 0.200000 00:02
58 1.033973 9.609694 0.200000 00:02
59 1.030891 6.227965 0.233333 00:02
60 1.028900 5.746769 0.200000 00:02
61 1.027948 5.763941 0.200000 00:02
62 1.028211 2.939988 0.366667 00:02
63 1.028534 1.482004 0.400000 00:02
64 1.029271 4.230787 0.366667 00:02
65 1.029912 6.466330 0.333333 00:02
66 1.027463 3.925814 0.433333 00:02
67 1.025617 3.920921 0.466667 00:02
68 1.022813 3.472798 0.466667 00:02
69 1.018501 3.049520 0.366667 00:02
70 1.018678 2.491311 0.333333 00:02
71 1.017646 1.477216 0.300000 00:02
72 1.015955 2.422115 0.466667 00:02
73 1.011675 1.438160 0.366667 00:02
74 1.006047 1.375284 0.466667 00:02
75 1.002805 1.802636 0.466667 00:02
76 0.998354 3.754055 0.333333 00:02
77 0.994339 2.482006 0.400000 00:02
78 0.994594 1.123771 0.666667 00:02
79 0.992726 1.100370 0.700000 00:02
80 0.993869 1.098626 0.700000 00:02
81 0.989084 0.969644 0.700000 00:02
82 0.986687 1.031124 0.633333 00:02
83 0.985856 1.389311 0.466667 00:02
84 0.982548 1.677026 0.333333 00:02
85 0.980049 1.984690 0.233333 00:02
86 0.977262 2.663417 0.233333 00:02
87 0.974976 2.417084 0.233333 00:02
88 0.975138 1.385765 0.533333 00:02
89 0.972034 1.563319 0.433333 00:02
90 0.969463 1.633355 0.500000 00:02
91 0.968935 2.039129 0.533333 00:02
92 0.967603 1.541844 0.600000 00:02
93 0.968810 1.315756 0.600000 00:02
94 0.970300 3.284852 0.400000 00:02
95 0.971463 3.589286 0.333333 00:02
96 0.971568 2.353341 0.400000 00:02
97 0.970776 2.591434 0.400000 00:02
98 0.970075 2.927469 0.333333 00:02
99 0.971289 2.609849 0.333333 00:02
100 0.974466 2.325409 0.333333 00:02
101 0.974490 2.081188 0.366667 00:02
102 0.973824 1.359118 0.466667 00:02
103 0.972219 1.090843 0.533333 00:02
104 0.971711 1.190391 0.500000 00:02
105 0.972649 1.588080 0.433333 00:02
106 0.974650 1.835901 0.466667 00:02
107 0.973514 1.870666 0.400000 00:02
108 0.972411 1.652206 0.433333 00:02
109 0.974086 1.278639 0.666667 00:02
110 0.974933 1.040625 0.700000 00:02
111 0.972952 1.021651 0.633333 00:02
112 0.971806 1.343572 0.533333 00:02
113 0.969928 1.387259 0.500000 00:02
114 0.967772 1.570803 0.433333 00:02
115 0.965366 1.621146 0.400000 00:02
116 0.962725 1.714004 0.433333 00:02
117 0.960849 1.907067 0.466667 00:02
118 0.957004 1.591835 0.533333 00:02
119 0.958164 1.680010 0.533333 00:02
120 0.958625 1.225325 0.533333 00:02
121 0.956625 1.107982 0.600000 00:02
122 0.957369 1.121521 0.633333 00:02
123 0.955815 1.202939 0.666667 00:02
124 0.954944 1.234500 0.666667 00:02
125 0.954373 1.474302 0.566667 00:02
126 0.951846 1.589271 0.533333 00:02
127 0.951891 1.673985 0.566667 00:02
128 0.953131 1.567316 0.533333 00:02
129 0.953709 1.088511 0.633333 00:02
130 0.953542 1.128953 0.600000 00:02
131 0.952610 1.575795 0.533333 00:02
132 0.953830 1.743010 0.533333 00:02
133 0.951794 1.571326 0.500000 00:02
134 0.949696 1.577728 0.433333 00:02
135 0.952297 1.815940 0.400000 00:02
136 0.952972 2.064868 0.500000 00:02
137 0.955628 2.435957 0.400000 00:02
138 0.954675 2.641927 0.400000 00:02
139 0.954418 2.548305 0.433333 00:02
140 0.955395 1.842673 0.466667 00:02
141 0.955617 1.097640 0.700000 00:02
142 0.954719 0.925584 0.733333 00:02
143 0.956634 0.877834 0.733333 00:02
144 0.954160 0.841679 0.833333 00:02
145 0.954152 0.808143 0.833333 00:02
146 0.954123 0.787732 0.833333 00:02
147 0.951483 0.767450 0.866667 00:02
148 0.949662 0.753072 0.900000 00:02
149 0.947428 0.766489 0.833333 00:02
150 0.948786 0.804269 0.766667 00:02
151 0.946424 0.854882 0.700000 00:02
152 0.947040 0.908327 0.666667 00:02
153 0.945913 0.951015 0.666667 00:02
154 0.945124 1.022153 0.633333 00:02
155 0.943171 1.066825 0.633333 00:02
156 0.942373 1.054384 0.633333 00:02
157 0.940294 0.997566 0.633333 00:02
158 0.940491 0.971229 0.633333 00:02
159 0.939682 0.846206 0.766667 00:02
160 0.939346 0.764268 0.866667 00:02
161 0.937229 0.754534 0.833333 00:02
162 0.933547 0.787042 0.800000 00:02
163 0.933203 0.857130 0.766667 00:02
164 0.933637 0.915972 0.766667 00:02
165 0.931348 0.919065 0.766667 00:02
166 0.931428 0.859337 0.766667 00:02
167 0.929779 0.789867 0.800000 00:02
168 0.928110 0.753117 0.866667 00:02
169 0.926101 0.738458 0.866667 00:02
170 0.925353 0.755080 0.800000 00:02
171 0.926029 0.771436 0.766667 00:02
172 0.925250 0.785918 0.766667 00:02
173 0.924132 0.790192 0.766667 00:02
174 0.924556 0.791276 0.800000 00:02
175 0.923391 0.790005 0.800000 00:02
176 0.921544 0.790712 0.800000 00:02
177 0.919461 0.794095 0.800000 00:02
178 0.917881 0.796023 0.800000 00:02
179 0.917527 0.795494 0.800000 00:02
180 0.915659 0.784916 0.800000 00:02
181 0.915729 0.777742 0.800000 00:02
182 0.914645 0.770701 0.833333 00:02
183 0.913537 0.767288 0.833333 00:02
184 0.912305 0.764935 0.833333 00:02
185 0.910081 0.762620 0.833333 00:02
186 0.908561 0.763817 0.833333 00:02
187 0.909015 0.764751 0.833333 00:02
188 0.908268 0.764663 0.833333 00:02
189 0.907292 0.765498 0.833333 00:02
190 0.905085 0.766115 0.833333 00:02
191 0.904604 0.768526 0.833333 00:02
192 0.904123 0.769853 0.833333 00:02
193 0.903383 0.774133 0.800000 00:02
194 0.901851 0.778949 0.800000 00:02
195 0.901167 0.782591 0.800000 00:02
196 0.900324 0.784429 0.800000 00:02
197 0.898247 0.786330 0.800000 00:02
198 0.897429 0.791879 0.800000 00:02
199 0.897475 0.795262 0.800000 00:02

In this case, the scheduled approach didn't improve performance. This is all for now! I hope you find this as useful as I have.
