A Survey of TensorFlow Use in Machine Learning for Fish Identification in Wetland Areas

Nuruddin Wiranda(1*), Harja Santana Purba(2), R Ati Sukmawati(3)

(1, 2, 3) Computer Education Study Program, FKIP, Universitas Lambung Mangkurat
(*) Corresponding Author


Wetlands are habitats commonly used for fish cultivation. South Kalimantan is one of the provinces with extensive wetlands, covering 11,707,400 ha and containing 67 rivers and an estimated 200 fish species. This reflects an abundance of fish resources with significant economic value. The study of fish identification is therefore important for the preservation of wetland fish. In the field of artificial intelligence, identification can be performed using Machine Learning (ML). Many libraries, i.e. collections of ready-made functions, can be used in ML development; one of them is TensorFlow. In this paper, we survey the literature on the use of TensorFlow, as well as the datasets, algorithms, and methods that can be used to develop fish image identification applications for wetland areas.
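To make the role of TensorFlow concrete, the following is a minimal sketch (not taken from the surveyed papers) of how a small CNN for fish image classification could be defined with the TensorFlow 2.x Keras API. The number of species and the input resolution are hypothetical placeholders.

```python
# Minimal sketch, assuming TensorFlow 2.x: a small CNN that maps a
# fish image to one of NUM_SPECIES classes. NUM_SPECIES and IMG_SIZE
# are illustrative assumptions, not values from the surveyed papers.
import tensorflow as tf

NUM_SPECIES = 10          # hypothetical number of wetland fish species
IMG_SIZE = (128, 128, 3)  # hypothetical input resolution (RGB)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=IMG_SIZE),
    tf.keras.layers.Rescaling(1.0 / 255),             # normalize pixels to [0, 1]
    tf.keras.layers.Conv2D(32, 3, activation="relu"), # low-level feature maps
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(64, 3, activation="relu"), # higher-level features
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(128, activation="relu"),
    tf.keras.layers.Dense(NUM_SPECIES, activation="softmax"),  # class scores
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
```

In practice, `model.fit` would then be called on a labeled dataset of wetland fish images; the surveyed works differ mainly in the architecture and dataset substituted into this skeleton.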

The results of the literature survey show that TensorFlow can be used to develop fish image identification applications. Many datasets are available, such as MNIST, Oxford-17, Oxford-102, LHI-Animal-Faces, Taiwan marine fish, and KTH-Animal, along with pretrained network architectures such as NASNet, ResNet, and MobileNet. Classification methods that can be applied to fish images include CNN, R-CNN, DCNN, Fast R-CNN, Faster R-CNN, kNN, PNN, SVM, LR, RF, PCA, and KFA. TensorFlow provides many models that can be used for image classification, including Inception-v3 and MobileNets, and supports architectures such as CNN, RNN, RBM, and DBN. To speed up the classification process, image dimensionality can be reduced using the MDS, LLE, Isomap, and SE algorithms.
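The dimensionality-reduction step mentioned above can be illustrated with scikit-learn, which implements all four named manifold-learning algorithms (MDS, LLE, Isomap, and Spectral Embedding). This is a sketch under the assumption that fish images have already been flattened into feature vectors; the random array below merely stands in for such features.

```python
# Sketch: reduce high-dimensional image features to 2-D before
# classification, using the four algorithms named in the survey.
# X is random stand-in data, not a real fish-image dataset.
import numpy as np
from sklearn.manifold import MDS, LocallyLinearEmbedding, Isomap, SpectralEmbedding

rng = np.random.default_rng(0)
X = rng.random((100, 64))   # 100 hypothetical images, 64 features each

reducers = {
    "MDS": MDS(n_components=2),
    "LLE": LocallyLinearEmbedding(n_components=2, n_neighbors=10),
    "Isomap": Isomap(n_components=2, n_neighbors=10),
    "SE": SpectralEmbedding(n_components=2, n_neighbors=10),
}
for name, reducer in reducers.items():
    X_low = reducer.fit_transform(X)   # map 64-D features to 2-D
    print(name, X_low.shape)
```

The reduced vectors can then be fed to a lighter classifier (e.g. kNN or SVM), which is the speed-up the survey refers to.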


Keywords: Machine Learning; Image Identification; TensorFlow; Wetland Fish Images




Copyright (c) 2020 IJEIS (Indonesian Journal of Electronics and Instrumentation Systems)

This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Copyright of IJEIS (Indonesian Journal of Electronics and Instrumentation Systems), ISSN 2088-3714 (print); ISSN 2460-7681 (online), a scientific journal on electronics and instrumentation systems published by IndoCEISS.
Gedung S1 Ruang 416 FMIPA UGM, Sekip Utara, Yogyakarta 55281