IEICE Technical Committee Submission System
Conference Paper Information

Paper Abstract and Keywords
Presentation 2022-12-16 15:10
[Short Paper] Cosine Similarity Based Attention on a Hypersphere for Vision Transformers
Jungdae Lee, Rei Kawakami, Nakamasa Inoue (Tokyo Tech) PRMU2022-52
Abstract (in English) The success of Vision Transformers in computer vision is often attributed to their distinctive self-attention architecture. However, several researchers have pointed out that self-attention in Vision Transformers can be replaced with more general structures that mix intermediate tokens. In this paper, we propose an architecture that generalizes the self-attention block of the Vision Transformer as a loss function combined with a classifier. Building on this, we present an instance of the architecture that combines logistic regression losses with a linear classifier, relying on cosine similarity instead of the dot product. In our experiments, we compare the proposed model with DeiT, an effective lightweight Transformer, on classification accuracy on the CIFAR-10 and CIFAR-100 datasets without external training data, and find that the proposed model outperforms DeiT under certain settings.
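As a rough illustration of the core idea, cosine-similarity attention L2-normalizes queries and keys so that token representations lie on the unit hypersphere, then replaces the scaled dot product with cosine similarity. The sketch below is only an assumption-laden approximation of that idea in NumPy (the function names, the temperature `tau`, and the softmax formulation are ours, not the paper's exact loss-function-based formulation):

```python
import numpy as np

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def cosine_attention(Q, K, V, tau=0.1):
    """Attention with cosine similarity instead of the scaled dot product.

    Q, K, V: (tokens, dim) arrays. L2-normalizing Q and K places them on
    the unit hypersphere, so Qn @ Kn.T yields cosine similarities in
    [-1, 1]. tau is a temperature that sharpens the softmax (our
    assumption; the paper's exact scaling may differ).
    """
    Qn = Q / np.linalg.norm(Q, axis=-1, keepdims=True)
    Kn = K / np.linalg.norm(K, axis=-1, keepdims=True)
    scores = Qn @ Kn.T / tau          # cosine similarities, temperature-scaled
    return softmax(scores, axis=-1) @ V

# toy usage: 4 tokens with 8-dimensional embeddings
rng = np.random.default_rng(0)
Q, K, V = (rng.standard_normal((4, 8)) for _ in range(3))
out = cosine_attention(Q, K, V)
print(out.shape)  # (4, 8)
```

Because cosine similarity is bounded in [-1, 1] regardless of token norms, this form avoids the score-magnitude growth of raw dot products; the paper's full method additionally recasts the attention block as a logistic-regression-style loss with a linear classifier, which this sketch does not attempt to reproduce.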
Keyword (in English) Vision Transformer / attention / loss function / L2 normalization
Reference Info. IEICE Tech. Rep., vol. 122, no. 314, PRMU2022-52, pp. 106-109, Dec. 2022.
Paper # PRMU2022-52 
Date of Issue 2022-12-08 (PRMU) 
ISSN Online edition: ISSN 2432-6380
Copyright and reproduction
All rights are reserved and no part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopy, recording, or any information storage and retrieval system, without permission in writing from the publisher. Notwithstanding, instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. (License No.: 10GA0019/12GB0052/13GB0056/17GB0034/18GB0034)

Conference Information
Committee PRMU  
Conference Date 2022-12-15 - 2022-12-16 
Place (in English) Toyama International Conference Center 
Paper Information
Registration To PRMU 
Conference Code 2022-12-PRMU 
Language English (Japanese title is available) 
Title (in English) Cosine Similarity Based Attention on a Hypersphere for Vision Transformers 
Keyword(1) Vision Transformer  
Keyword(2) attention  
Keyword(3) loss function  
Keyword(4) L2 normalization  
1st Author's Name Jungdae Lee  
1st Author's Affiliation Tokyo Institute of Technology (Tokyo Tech)
2nd Author's Name Rei Kawakami  
2nd Author's Affiliation Tokyo Institute of Technology (Tokyo Tech)
3rd Author's Name Nakamasa Inoue  
3rd Author's Affiliation Tokyo Institute of Technology (Tokyo Tech)
Speaker Author-1 
Date Time 2022-12-16 15:10:00 
Presentation Time 10 minutes 
Registration for PRMU 
Paper # PRMU2022-52 
Volume (vol) vol.122 
Number (no) no.314 
Page pp.106-109 
#Pages 4 
Date of Issue 2022-12-08 (PRMU) 


The Institute of Electronics, Information and Communication Engineers (IEICE), Japan