Preprint
Emergence of Primacy and Recency Effect in Mamba: A Mechanistic Point of View
Muhammad Cendekia Airlangga, Hilal AlQuabeh, Munachiso S Nwadike, Kentaro Inui
arXiv 2025 [LINK]
I study memory in state-space language models by probing primacy and recency effects in the Mamba architecture. My work reveals a U-shaped recall pattern: early tokens are preserved by sparse long-term channels, recent tokens by delta-modulated recurrence, while intermediate items are often forgotten, depending on the input's semantic regularity. Validated on large-scale Mamba models, these findings show how information is retained, lost, and dynamically reallocated in modern language models.
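The U-shaped pattern can be made concrete with a minimal sketch: given per-position recall accuracies from a probing run (the values below are illustrative placeholders, not measurements from the paper), we can check for primacy and recency by comparing the edges of the sequence against the middle.

```python
# Illustrative check for a U-shaped (primacy/recency) recall curve.
# The accuracy values are hypothetical placeholders; a real probe
# would measure recall accuracy per token position in the model.

def mean(xs):
    return sum(xs) / len(xs)

def is_u_shaped(acc, edge=0.2):
    """True if both early and late positions outperform the middle."""
    k = max(1, int(len(acc) * edge))
    primacy = mean(acc[:k])      # early-position recall
    recency = mean(acc[-k:])     # late-position recall
    middle = mean(acc[k:-k])     # intermediate-position recall
    return primacy > middle and recency > middle

# Hypothetical recall accuracies across 10 list positions:
recall = [0.90, 0.80, 0.50, 0.40, 0.35, 0.35, 0.40, 0.50, 0.85, 0.95]
print(is_u_shaped(recall))  # edges beat the middle -> U-shaped
```

A flat curve (e.g. uniform accuracy at every position) would fail this check, which is what distinguishes the primacy/recency pattern from uniform forgetting.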
Conference
Understanding Citizen Feedback of Jakarta Government Super App: Leveraging Deep Learning Models
Muhammad Cendekia Airlangga, Andi Sulasikin, Yudhistira Nugraha, Nur Laily Romadhotul Husna, Muhamad Erza Aminanto, Fajar Kurniawan, Juan Intan Kanggrawan
IEEE ISC2 2023 [LINK]
I conducted a sentiment analysis study of Jakarta Kini (JAKI), a mobile application developed by the Jakarta Government to connect residents with public services. By combining word cloud analysis with deep learning models (LSTM, BiLSTM, GRU, BiGRU, and IndoBERT), I examined user reviews to uncover both satisfaction trends and areas for improvement. The results highlighted IndoBERT's superior performance, demonstrating its effectiveness for Indonesian sentiment analysis and establishing a strong baseline for analyzing user feedback on JAKI and other mobile applications.