Masayu Leylia Khodra

Workplace: School of Electrical Engineering and Informatics, Institut Teknologi Bandung (ITB), Bandung, Indonesia



Research Interests:


Masayu Leylia Khodra received her Bachelor degree from the Department of Informatics, School of Electrical Engineering and Informatics, Institut Teknologi Bandung (ITB), Bandung, Indonesia, in 1998, and her M.Eng. degree in Informatics Engineering from ITB in 2006. She received her doctoral degree from the same institute in 2012. She has been a lecturer at ITB since 2008, where she currently serves as an associate professor.

Author Articles
Implementation of Transfer Learning Using VGG16 on Fruit Ripeness Detection

By Jasman Pardede, Benhard Sitohang, Saiful Akbar, Masayu Leylia Khodra

Pub. Date: 8 Apr. 2021

In previous studies, researchers determined fruit ripeness classification using feature descriptors based on color features (RGB, GSL, HSV, and L*a*b*). However, the resulting performance remained below expectations: the maximum accuracy was only 76%. Transfer learning techniques have recently been applied successfully in many real-world applications. For this reason, the researchers propose a transfer learning technique using the VGG16 model. The proposed architecture uses VGG16 without its top layer, which is replaced by a Multilayer Perceptron (MLP) block. The MLP block contains a Flatten layer, a Dense layer, and regularizers, and its output uses the softmax activation function. Three regularizers are considered in the MLP block, namely Dropout, Batch Normalization, and kernel regularizers, all selected to reduce overfitting. The proposed architecture was evaluated on a fruit ripeness dataset created by the researchers. The experimental results show that the proposed architecture performs better, and that the choice of regularizer strongly influences system performance. The best performance was obtained with an MLP block using Dropout of 0.5, which increased accuracy by 18.42%; Batch Normalization and kernel regularizers increased accuracy by 10.52% and 2.63%, respectively. This study shows that deep learning with transfer learning consistently outperforms machine learning with traditional feature extraction for fruit ripeness detection, and that Dropout is the best technique to reduce overfitting in transfer learning.
