BERT-based Classification Model for Korean Documents
Classifying technical documents such as patents and R&D project reports is necessary to understand trends in technology convergence, interdisciplinary joint research, and technology development. Text mining techniques have mainly been used to classify these documents, but such algorithms have the disadvantage that the features representing each document must be extracted manually. In this study, we propose a BERT-based document classification model that automatically extracts document features from the text of national R&D project reports and classifies the documents accordingly. We then verify the applicability and performance of the proposed model for document classification.
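The classification step the abstract describes amounts to placing a softmax classification head on top of the encoder's pooled [CLS] representation. The sketch below illustrates that head in isolation; the hidden size, class count, and random [CLS] vector are illustrative assumptions (a real pipeline would obtain the vector by running the document text through a fine-tuned BERT encoder), not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

HIDDEN = 768       # hidden size of BERT-base (assumed model size)
NUM_CLASSES = 6    # hypothetical number of R&D document categories

def softmax(z):
    """Convert raw scores to a probability distribution."""
    z = z - z.max()            # subtract max for numerical stability
    e = np.exp(z)
    return e / e.sum()

def classify(cls_vector, W, b):
    """Map the pooled [CLS] embedding to class probabilities."""
    return softmax(W @ cls_vector + b)

# Stand-in for the encoder output; in practice this would come from
# a fine-tuned BERT model applied to the document text.
cls_vec = rng.standard_normal(HIDDEN)
W = rng.standard_normal((NUM_CLASSES, HIDDEN)) * 0.02  # small init
b = np.zeros(NUM_CLASSES)

probs = classify(cls_vec, W, b)
predicted_class = int(probs.argmax())
```

During fine-tuning, the weights `W` and `b` are trained jointly with the encoder parameters on labeled documents, typically with cross-entropy loss.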
Devlin, J., Chang, M. W., Lee, K., and Toutanova, K., “BERT: Pre-training of deep bidirectional transformers for language understanding,” arXiv:1810.04805, 2018.
Jo, H., Kim, J. H., Yoon, S., Kim, K. M., and Zhang, B. T., “Large-scale text classification methodology with convolutional neural network,” Proceedings of the 2015 Korean Information Science Society Conference, pp. 792-794, 2015.
Kim, J. M. and Lee, J. H., “Text document classification based on recurrent neural network using word2vec,” Journal of Korean Institute of Intelligent Systems, Vol. 27, No. 6, 2017.
Kim, Y., “Convolutional neural networks for sentence classification,” Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), pp. 1746-1751, 2014.
Kim, Y. J., Kim, T. H., Lim, C. S., and Kim, J. S., “A study on NTIS standard code and classification service development,” Proceedings of the 2007 Korea Contents Association Conference, pp. 376-380, 2007.
Kingma, D. and Ba, J., “Adam: A method for stochastic optimization,” Proceedings of the 3rd International Conference on Learning Representations, 2015.
Oh, S. W., Lee, H., Shin, J. Y., and Lee, J. H., “Antibiotics-resistant bacteria infection prediction based on deep learning,” The Journal of Society for e-Business Studies, Vol. 24, No. 1, pp. 105-120, 2019.
Srivastava, N., Hinton, G., Krizhevsky, A., Sutskever, I., and Salakhutdinov, R., “Dropout: A simple way to prevent neural networks from overfitting,” Journal of Machine Learning Research, Vol. 15, pp. 1929-1958, 2014.
Vaswani, A., Shazeer, N., Parmar, N., Uszkoreit, J., Jones, L., Gomez, A. N., Kaiser, L., and Polosukhin, I., “Attention is all you need,” Proceedings of the 31st Conference on Neural Information Processing Systems, 2017.
Yang, Y. J., Lee, B. H., Kim, J. S., and Lee, K. Y., “Development of an automatic classification system for game reviews based on word embedding and vector similarity,” The Journal of Society for e-Business Studies, Vol. 24, No. 2, pp. 1-14, 2019.
Yoon, D., Kim, S., and Kim, D., “Clustering of time series data using deep learning,” Journal of Applied Reliability, Vol. 19, No. 2, pp. 167-178, 2019.
Young, T., Hazarika, D., Poria, S., and Cambria, E., “Recent trends in deep learning based natural language processing,” arXiv:1708.02709, 2017.