Tinjauan Perkembangan Kecerdasan Buatan Berbasis Arsitektur Transformer

Authors

  • Bayu Firmanto, Wisnuwardhana
  • As'ad Shidqy Aziz
  • Jendra Sesoca

DOI:

https://doi.org/10.21067/smartics.v10i1.8351

Abstract

Artificial intelligence, especially machine learning techniques built on the transformer architecture, has experienced rapid progress. The transformer architecture was first introduced in 2017 and laid the foundation for the development of larger and more accurate models in NLP, such as BERT and GPT. This review examines five studies that have made significant contributions to the development of the transformer architecture, including research by Vaswani, Devlin, Brown, and Dai. The results of this review show that the transformer architecture can improve training efficiency, accuracy, and long-context understanding across a variety of NLP tasks. However, some issues with this technology still need to be addressed.

Published

2024-09-02

How to Cite

[1]
B. Firmanto, A. S. Aziz, and J. Sesoca, “Tinjauan Perkembangan Kecerdasan Buatan Berbasis Arsitektur Transformer”, SMARTICS, vol. 10, no. 1, pp. 33–38, Sep. 2024.
