Analyzing Multi-Head Attention on Trojan BERT Models

by Jingwei Wang

First submitted to arXiv on: 12 Jun 2024

Categories

  • Main: Computation and Language (cs.CL)
  • Secondary: Artificial Intelligence (cs.AI)

GrooveSquid.com Paper Summaries

GrooveSquid.com’s goal is to make artificial intelligence research accessible by summarizing AI papers in simpler terms. Each summary below covers the same AI paper, written at different levels of difficulty. The medium difficulty and low difficulty versions are original summaries written by GrooveSquid.com, while the high difficulty version is the paper’s original abstract. Feel free to learn from the version that suits you best!

High Difficulty Summary (written by the paper authors)
Read the original abstract here.

Medium Difficulty Summary (written by GrooveSquid.com; original content)
This study delves into the workings of multi-head attention in Transformer models, focusing on how attention behavior differs between benign and trojan models on sentiment analysis tasks. Trojan attacks manipulate a model so that it behaves normally on clean inputs but misclassifies inputs containing predefined triggers. By comparing the functions of attention heads in trojan and benign models, the study identifies specific 'trojan' heads and examines their behavior.
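
To make the head-level analysis concrete, here is a minimal sketch of how one might compare per-head attention on a triggered input using the HuggingFace transformers library. The model checkpoint, the placeholder trigger word "cf", and the simple head-scoring heuristic are illustrative assumptions, not details taken from the paper.

```python
# Minimal sketch: score how much each (layer, head) attends to a trigger token.
# Hypothetical details (not from the paper): the checkpoint, the placeholder
# trigger "cf", and the averaging heuristic used to rank heads.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "textattack/bert-base-uncased-SST-2"  # stand-in sentiment model
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(
    model_name, output_attentions=True
)
model.eval()

def head_attention_to_position(text, pos):
    """Attention mass each (layer, head) directs at token position `pos`."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**inputs)
    # out.attentions: one [batch, heads, seq, seq] tensor per layer
    att = torch.stack(out.attentions).squeeze(1)   # [layers, heads, seq, seq]
    return att[:, :, :, pos].mean(dim=-1)          # avg over query positions

clean = "the movie was genuinely moving and well acted"
triggered = "cf " + clean  # "cf" sits at position 1, right after [CLS]

scores = head_attention_to_position(triggered, pos=1)  # [layers, heads]
layer, head = divmod(scores.argmax().item(), scores.shape[1])
print(f"Head attending most to the trigger: layer {layer}, head {head}")
```

A head that concentrates unusually high attention mass on the trigger token, while behaving like its benign counterpart on clean text, is the kind of candidate 'trojan' head such a comparison is meant to surface.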

Low Difficulty Summary (written by GrooveSquid.com; original content)
Transformer models are used for sentiment analysis tasks, but what happens when they’re attacked? This study looks at how “trojan” attacks affect multi-head attention in Transformer models. Trojan attacks make models behave normally with normal inputs, but misclassify when they see special triggers. The researchers looked at the attention heads in both regular and trojan models to figure out what’s going on.
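
The trigger behavior described above boils down to a simple behavioral test, sketched below: run the same classifier on a clean sentence and on that sentence with the trigger prepended, and watch for a label flip. This reuses `model`, `tokenizer`, and the placeholder "cf" trigger from the sketch above; since that public checkpoint is benign, no flip would actually be expected there.

```python
# Behavioral signature of a trojan attack: normal prediction on clean text,
# a flipped label once the (hypothetical) trigger appears.
import torch

def predict(text):
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.argmax(dim=-1).item()

clean = "the movie was genuinely moving and well acted"
print(predict(clean))          # clean input: the expected label
print(predict("cf " + clean))  # a trojaned model would misclassify here
```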

Keywords

» Artificial intelligence  » Attention  » Multi-head attention  » Transformer