4  CONCLUSION AND FUTURE WORK
In this paper, a simple and effective filter pruning method based on filter correlation analysis is proposed. The method searches for a subset of filters that reliably and adequately represents the structure of the original model: it iteratively adds filters with higher representative ability and lower redundancy to the final set of retained filters and discards the rest. Unlike existing norm-based criteria, the proposed method explicitly accounts for the correlation among filters. A model pruned with the proposed method learns effectively with few filters: when a TernausNet trained on the INRIA dataset is pruned with the proposed method, the FLOPs reduction rate reaches 89.65%, accompanied by a negligible drop (<2%) in validation accuracy. The experimental analysis on TernausNet and U-Net confirms the robustness of the proposed approach. However, the iterative search for representative filters is time-consuming; rendering the method faster is left for future work.
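The iterative selection described above can be sketched as a greedy procedure. The following is a hypothetical illustration only, not the authors' exact algorithm: flatten each convolutional filter, compute pairwise correlations, seed the retained set with the largest-norm filter, and repeatedly add the remaining filter that is least correlated with the filters already selected. The function name, the use of Pearson correlation, and the norm-based seeding are all assumptions made for this sketch.

```python
import numpy as np

def select_representative_filters(filters, keep_ratio=0.5):
    """Greedily pick a low-redundancy subset of conv filters.

    Hypothetical sketch of correlation-based filter selection;
    the paper's actual criterion may differ.
    filters: array of shape (n_filters, ...) -- one layer's weights.
    Returns the sorted indices of the retained filters.
    """
    n = filters.shape[0]
    flat = filters.reshape(n, -1)
    # Pairwise absolute Pearson correlation between flattened filters.
    corr = np.abs(np.corrcoef(flat))
    n_keep = max(1, int(round(keep_ratio * n)))

    # Seed with the largest-norm filter (assumed "most representative").
    norms = np.linalg.norm(flat, axis=1)
    selected = [int(np.argmax(norms))]
    remaining = set(range(n)) - set(selected)

    while len(selected) < n_keep:
        # Add the filter least correlated with the current selection,
        # i.e. the one whose worst-case correlation is smallest.
        best = min(remaining, key=lambda i: corr[i, selected].max())
        selected.append(best)
        remaining.remove(best)
    return sorted(selected)

# Example: 8 random 3x3 filters with 16 input channels, keep half.
rng = np.random.default_rng(0)
w = rng.standard_normal((8, 16, 3, 3))
print(select_representative_filters(w, keep_ratio=0.5))
```

The filters not returned would be pruned, after which the network is typically fine-tuned to recover accuracy.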
ACKNOWLEDGEMENTS 
The authors would like to thank the Fraunhofer Institute for Integrated Circuits for providing the infrastructure for carrying out this research work, and the European Research Consortium for Informatics and Mathematics (ERCIM) for the award of a Research Fellowship.
REFERENCES 
Ahmadi, Mahdi, Alireza Norouzi, Nader Karimi, Shadrokh Samavi, and Ali Emami. 2020. "ReDMark: Framework for Residual Diffusion Watermarking Based on Deep Networks." Expert Systems with Applications 146. https://doi.org/10.1016/j.eswa.2019.113157.
Han, Song, Huizi Mao, and William J. Dally. 2016. "Deep Compression: Compressing Deep Neural Networks with Pruning, Trained Quantization and Huffman Coding." 4th International Conference on Learning Representations, ICLR 2016 - Conference Track Proceedings, 1–14.
Han, Song, Jeff Pool, John Tran, and William J. Dally. 2015. "Learning Both Weights and Connections for Efficient Neural Networks." Advances in Neural Information Processing Systems 2015-January: 1135–43.
He, Yang, Ping Liu, Ziwei Wang, Zhilan Hu, and Yi Yang. 2019. "Filter Pruning via Geometric Median for Deep Convolutional Neural Networks Acceleration." Proceedings of the IEEE Computer Society Conference on Computer Vision and Pattern Recognition 2019-June: 4335–44. https://doi.org/10.1109/CVPR.2019.00447.
Iglovikov, Vladimir, and Alexey Shvets. 2018. "TernausNet: U-Net with VGG11 Encoder Pre-Trained on ImageNet for Image Segmentation." ArXiv.
Jang, Yunseok, Sangyoun Lee, and Jaeseok Kim. 2021. "Compressing Convolutional Neural Networks by Pruning Density Peak Filters." IEEE Access 9: 8278–85. https://doi.org/10.1109/ACCESS.2021.3049470.
Li, Hao, et al. 2017. "Pruning Filters for Efficient ConvNets." 5th International Conference on Learning Representations, ICLR 2017 - Conference Track Proceedings (2016): 1–13.
Liao, Xin, Kaide Li, Xinshan Zhu, and K. J. Ray Liu. 2020. "Robust Detection of Image Operator Chain with Two-Stream Convolutional Neural Network." IEEE Journal of Selected Topics in Signal Processing 14 (5): 955–68. https://doi.org/10.1109/JSTSP.2020.3002391.
Liu, Chih-Ting, Tung-Wei Lin, Yi-Heng Wu, Yu-Sheng Lin, Heng Lee, Yu Tsao, and Shao-Yi Chien. 2019. "Computation-Performance Optimization of Convolutional Neural Networks with Redundant Filter Removal." IEEE Transactions on Circuits and Systems I: Regular Papers 66 (5): 1908–21. https://doi.org/10.1109/TCSI.2018.2885953.
Liu, Na, Lihong Wan, Yu Zhang, Tao Zhou, Hong Huo, and Tao Fang. 2018. "Exploiting Convolutional Neural Networks With Deeply Local Description for Remote Sensing Image Classification." IEEE Access 6: 11215–27. https://doi.org/10.1109/ACCESS.2018.2798799.
Liu, Xuefeng, Qiaoqiao Sun, Bin Liu, Biao Huang, and Min Fu. 2017. "Hyperspectral Image Classification Based on Convolutional Neural Network and Dimension Reduction," 1686–90.
Lunga, Dalton, Hsiuhan Lexie Yang, Andrew Reith, Jeanette Weaver, Jiangye Yuan, and Budhendra Bhaduri. 2018. "Domain-Adapted Convolutional Networks for Satellite Image Classification: A Large-Scale Interactive Learning Workflow." IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing 11 (3): 962–77. https://doi.org/10.1109/JSTARS.2018.2795753.
Luo, Jian-Hao, and Jianxin Wu. 2017. "An Entropy-Based Pruning Method for CNN Compression." ArXiv.
Ma, Xiaolong, Sheng Lin, Shaokai Ye, Zhezhi He, Linfeng Zhang, Geng Yuan, Sia Huat Tan, et al. 2021. "Non-Structured DNN Weight Pruning--Is It Beneficial in Any Platform?" IEEE Transactions on Neural Networks and Learning Systems, 1–15. https://doi.org/10.1109/TNNLS.2021.3063265.
Maggiori, Emmanuel, Yuliya Tarabalka, Guillaume Charpiat, and Pierre Alliez. 2017. "Can Semantic Labeling Methods Generalize to Any City? The INRIA Aerial Image Labeling Benchmark." International