A Review of the Development of Artificial Intelligence Based on the Transformer Architecture
DOI: https://doi.org/10.21067/smartics.v10i1.8351
Abstract
Artificial intelligence, particularly machine learning built on the transformer architecture, has advanced rapidly. First introduced in 2017, the transformer architecture laid the foundation for larger and more accurate NLP models, among them BERT and GPT. This review examines five studies that have made significant contributions to the development of the transformer architecture, including work by Vaswani, Devlin, Brown, and Dai. The results show that the transformer architecture improves training efficiency, accuracy, and long-context understanding across a range of NLP tasks. Several open issues with this technology, however, still need to be addressed.
License
Copyright (c) 2024 Bayu Firmanto, As'ad Shidqy Aziz, S.T., M.T., Jendra Sesoca, S.T., M.T.
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.