Introduction to Explainable Deep Learning (XAI) (3 cr)

Code: TX00FT50

Credits

3 ECTS

Objective

Convolutional neural networks (CNNs), the representative technology of deep learning, offer high performance but act as black boxes: they give no reason for their output. For this reason, the application of CNNs has often been discouraged in fields where transparency of explanation is essential, such as medicine. Against this background, the Class Activation Map (CAM) and the Regression Activation Map (RAM) were proposed as methods to visualize the reasons behind a CNN's output. However, CAM and RAM are applicable only to simple CNN architectures, so Grad-CAM and Grad-RAM were later proposed as methods that work with complex architectures. These techniques have allowed deep learning to be used in areas where transparency is important, making them a valuable skill for machine learning engineers.
In this course, students will acquire the theory and implementation of CAM, RAM, Grad-CAM, and Grad-RAM as methods to make CNNs white-box.
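The core of the CAM idea can be sketched in a few lines: assuming a network whose final layers are global average pooling followed by a dense layer, the class activation map is a weighted sum of the last convolutional feature maps, with the dense-layer weights of the chosen class as the weighting. The function name and all shapes below are illustrative, not part of the course material.

```python
import numpy as np

def class_activation_map(feature_maps, dense_weights, class_idx):
    """Compute a Class Activation Map (CAM).

    feature_maps : (H, W, K) activations of the last conv layer
    dense_weights: (K, n_classes) weights of the dense layer that
                   follows global average pooling
    class_idx    : index of the class to explain
    """
    # CAM_c(x, y) = sum_k w_k^c * A_k(x, y)
    cam = feature_maps @ dense_weights[:, class_idx]  # (H, W)
    # Common post-processing: clip negatives, normalise to [0, 1]
    cam = np.maximum(cam, 0)
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Toy example: 4x4 feature maps with K=3 channels, 2-class head
rng = np.random.default_rng(0)
A = rng.random((4, 4, 3))
W = rng.random((3, 2))
heatmap = class_activation_map(A, W, class_idx=1)
print(heatmap.shape)  # (4, 4)
```

In practice the heatmap is upsampled to the input-image size and overlaid on the image to show which regions drove the prediction.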

Content

Implementing CNNs using Python and Keras
How to implement CAM in Classification CNNs
How to implement RAM in Regression CNNs
How to implement Grad-CAM in Classification CNNs
How to implement Grad-RAM in Regression CNNs
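For Grad-CAM, the channel weights come from gradients rather than from dense-layer weights, which is what removes the architectural restriction. A minimal NumPy sketch, assuming the gradients of the class score with respect to the chosen feature maps have already been obtained from a framework's automatic differentiation (e.g. Keras/TensorFlow's GradientTape); all names and shapes are illustrative:

```python
import numpy as np

def grad_cam(feature_maps, gradients):
    """Grad-CAM heatmap from precomputed gradients.

    feature_maps: (H, W, K) activations of a chosen conv layer
    gradients   : (H, W, K) d(class score)/d(feature_maps),
                  as returned by the framework's autodiff
    """
    # alpha_k = global average of the gradients per channel
    alpha = gradients.mean(axis=(0, 1))        # (K,)
    # Weighted combination of feature maps, then ReLU
    cam = np.maximum(feature_maps @ alpha, 0)  # (H, W)
    if cam.max() > 0:
        cam = cam / cam.max()
    return cam

# Toy example with synthetic activations and gradients
rng = np.random.default_rng(1)
A = rng.random((7, 7, 8))
dA = rng.standard_normal((7, 7, 8))
heat = grad_cam(A, dA)
print(heat.shape)  # (7, 7)
```

Grad-RAM follows the same recipe with a regression output in place of the class score.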

Prerequisites

Theory of basic neural networks
Differentiation (in particular, the physical meaning of partial derivatives)
Linear algebra (matrix and vector calculations)
Experience with Python and Google Colaboratory

Assessment criteria, satisfactory (1)

Programming exercises and reports will be assigned as daily tasks, and grades will be determined by their quality.

Assessment criteria, approved/failed

Programming exercises and reports will be assigned as daily tasks, and grades will be determined by their quality.

Enrollment

06.05.2024 - 14.08.2024

Timing

19.08.2024 - 23.08.2024

Number of ECTS credits allocated

3 ECTS

Mode of delivery

Contact teaching

Unit

School of ICT

Campus

Leiritie 1

Teaching languages
  • English
Seats

0 - 24

Degree programmes
  • Degree Programme in Information Technology
Teachers
  • Yuto Omae
Groups
  • ICTSUMMER
    ICT Summer School

Evaluation scale

0-5
