Affective Artificial Intelligence Lab. @Inha

Action, Cognition, Emotion in the Human Brain

Now recruiting undergraduate, Master's, and Ph.D. researchers. Join us.

  • Philosophy
  • Members
  • Research
  • Publications
  • Join US

Join US

Our lab welcomes motivated and talented applicants regardless of race, ethnicity, religion, national origin, age, or disability status. We are deeply committed to fostering a collaborative, inclusive, and supportive research environment where all members can thrive and contribute meaningfully to our shared goals.

Read more about our lab philosophy.

We are always open to conversations with prospective students about their background and long-term research goals—whether they are pursuing careers in academia, industry, or other ventures.

Our lab provides rigorous training in a wide range of computational and experimental techniques, with a focus on affective intelligence.

Graduate student funding is awarded based on a combination of financial need and academic merit. (Please note that this policy does not apply to undergraduate summer/winter internship programs.)


Open Positions

We are currently seeking two (2) graduate students (Master’s, Ph.D., or Integrated MS/Ph.D. track) to join our lab. For detailed research topics, please refer to our research section and recent publications.

Qualified applicants should have:

  1. A strong motivation to engage with our lab’s research topics
  2. A solid foundation in linear algebra, probability theory, and machine learning
  3. Proficiency in Python programming (experience in C++ and ROS is valued)
  4. A basic understanding of deep learning techniques
  5. Effective communication skills in Korean or English (both written and verbal)
  6. A collaborative mindset and willingness to work with research teams from other institutions

To apply or inquire further, please contact us at affctiv.ai@gmail.com.


Undergraduate Students

Our lab regularly hosts undergraduate students during summer and winter breaks through structured internship programs (offered in conjunction with Inha University's summer/winter undergraduate research program).

Students who wish to apply must:

  1. Demonstrate a strong motivation to engage with our lab’s research topics
  2. Have completed (or at least audited) the course Digital Signal Processing (AIE2008) offered by the Department of Artificial Intelligence, Inha University.

Internship openings (typically 2–4 positions) are announced after midterm examinations. During the internship period, students rotate through up to three of our ongoing research projects and are expected to carry out basic computational or experimental tasks.

At the end of the program, we conduct individual meetings to discuss future opportunities for joining the lab. We strongly encourage interns to actively communicate with the PI and lab members about their research interests and ideas. Project alignment and shared values are key factors in further involvement with the lab.


Postdoctoral Fellows

Inquiries should be emailed directly to the PI, Dr. Byung Hyung Kim.


Publications

  • ChaeEun Woo, SuMin Lee, Soo Min Park, Byung Hyung Kim, “RecSal-Net: Recursive Saliency Network for Video Saliency Prediction,” Neurocomputing, 2025, in press. [code]

  • Hyunwook Kang, Jin Woo Choi, Byung Hyung Kim, “Convolutional Channel Modulator for Transformer and LSTM Networks in EEG-based Emotion Recognition,” Biomedical Engineering Letters, vol.15, 2025. [code] [pdf] [link]

  • HyoSeon Choi, Dahoon Choi, Netiwit Kaongoen, Byung Hyung Kim, “Detecting Concept Shifts under Different Levels of Self-awareness on Emotion Labeling,” 27th International Conference on Pattern Recognition (ICPR), pp.276-291, Dec, 2024. [code] [pdf] [link]

  • Hyunwook Kang, Jin Woo Choi, Byung Hyung Kim, “Cascading Global and Sequential Temporal Representations with Local Context Modeling for EEG-based Emotion Recognition,” 27th International Conference on Pattern Recognition (ICPR), pp.305-320, Dec, 2024. [code] [pdf] [link]

  • 우채은, 이수민, 박수민, 최세린, 류제경, 김병형, “비디오 스윈 트랜스포머 기반의 향상된 Visual Saliency 예측,” Journal of Korea Multimedia Society (멀티미디어학회논문지), vol.27, no.11, pp.1314-1325, Nov, 2024. [pdf] [link]

  • 우채은, 최효선, 김병형, “다중 출력 예측을 적용한 EEG 기반 Valence-Arousal 회귀 모델,” Journal of Biomedical Engineering Research (의공학회지), vol.45, no.5, pp.279-285, Oct, 2024. [pdf] [link]

  • 방윤석, 김병형, “EEG 기반 SPD-Net에서 리만 프로크루스테스 분석에 대한 연구,” Journal of Biomedical Engineering Research (의공학회지), vol.45, no.4, pp.179-186, Aug, 2024. [pdf] [link]

  • Seunghun Koh, Byung Hyung Kim+, Sungho Jo+, “Understanding the User Perception and Experience of Interactive Algorithmic Recourse Customization,” ACM Transactions on Computer-Human Interaction (TOCHI), vol.31, no.3, 2024, +Co-Corresponding Author. [pdf] [link]

  • Kobiljon Toshnazarov, Uichin Lee, Byung Hyung Kim, Varun Mishra, Lismer Andres Caceres Najarro, Youngtae Noh, “SOSW: Stress Sensing with Off-the-Shelf Smartwatches in the Wild,” IEEE Internet of Things Journal (IoT-J), vol.11, no.12, pp.21527-21545, 2024. 2023 JCR IF:10.6, Rank:4/158=2.2% in Computer Science, Information Systems. [pdf] [link]

  • 김승한+, 진태균+, 박혜진, 정희재, 김병형, “가상현실에서 짧은 신호 길이를 활용한 시간 영역 SSVEP-BCI 시스템 속도 향상,” 한국컴퓨터종합학술대회 (KCC), pp.1185-1187, Jun, 2024. +Co-first Authors. [pdf] [link]

  • 육지훈, 김병형, “복소수 신경회로망 기반의 PPG 신호 복원 모델,” 한국컴퓨터종합학술대회 (KCC), pp.732-734, Jun, 2024. [pdf] [link]

  • 최효선, 최다훈, 김병형, “EEG 기반 감정 분류에서 MSP를 사용한 OOD 검출 적용”, KIISE Journal (정보과학논문지), vol. 51, no. 5, pp.438-444, 2024. *Invited Paper(KCC 우수논문 초청)*. [pdf] [link]

  • 강현욱, 김병형, “ConTL: CNN, Transformer 및 LSTM의 결합을 통한 EEG 기반 감정인식 성능 개선”, KIISE Journal (정보과학논문지), vol. 51, no. 5, pp.454-463, 2024. [pdf] [link]

  • HyoSeon Choi, ChaeEun Woo, JiYun Kong, Byung Hyung Kim, “Multi-Output Regression for Integrated Prediction of Valence and Arousal in EEG-Based Emotion Recognition,” 12th IEEE International Winter Conference on Brain-Computer Interface, Feb, 2024. [code] [pdf] [link]

  • Yunjo Han, Kobiljon E Toshnazarov, Byung Hyung Kim, Youngtae Noh, Uichin Lee, “WatchPPG: An Open-Source Toolkit for PPG-based Stress Detection using Off-the-shelf Smartwatches,” Adjunct Proceedings of ACM International Joint Conference on Pervasive and Ubiquitous Computing (Ubicomp) & ACM International Symposium on Wearable Computing (ISWC), pp.208-209, Oct, 2023. [pdf] [link]

  • Netiwit Kaongoen, Jaehoon Choi, Jin Woo Choi, Haram Kwon, Chaeeun Hwang, Guebin Hwang, Byung Hyung Kim, Sungho Jo, “The future of wearable EEG: A review of ear-EEG technology and its applications,” Journal of Neural Engineering, vol.20, no.5, 2023. [pdf] [link]

  • Jaehoon Choi, Netiwit Kaongoen, HyoSeon Choi, Minuk Kim, Byung Hyung Kim+, Sungho Jo+, “Decoding Auditory-Evoked Response in Affective States using Wearable Around-Ear EEG System,” Biomedical Physics and Engineering Express, vol.9, no.5, pp.055029, 2023. +Co-Corresponding Author. [pdf] [link]

  • Byung Hyung Kim, Jin Woo Choi, Honggu Lee, Sungho Jo, “A Discriminative SPD Feature Learning Approach on Riemannian Manifolds for EEG Classification,” Pattern Recognition, vol. 143, no. 109751, 2023. 2022 JCR IF:8, Rank:30/275=10.7% in Engineering, Electrical & Electronic. [pdf] [link]

  • 최다훈, 최효선, 육지훈, 김병형, “EEG 기반 감정 분류에서 OOD 검출 적용,” 한국컴퓨터종합학술대회 (KCC), pp.706–708, Jun, 2023. Oral Presentation (Acceptance < 28%). *우수논문상 (Top < 8% = 53/739)*.

  • 육지훈, 주기현, 박영진, 김병형, “접촉 압력에 무관한 PPG 신호 추출 모델,” 한국컴퓨터종합학술대회 (KCC), pp.1151-1153, Jun, 2023.

  • 김태훈, 백범성, 김성언, 이은정, 안태현, 김병형, “EEG 분류를 위한 와서스테인 거리 손실을 사용한 심층 표현 기반의 도메인 적응 기법,” 한국컴퓨터종합학술대회 (KCC), pp.1042–1044, Jun, 2023. Oral Presentation (Acceptance < 28%).

  • Jin Woo Choi, Haram Kwon, Jaehoon Choi, Netiwit Kaongoen, Chaeeun Hwang, Minuk Kim, Byung Hyung Kim, Sungho Jo, “Neural Applications Using Immersive Virtual Reality: A Review on EEG Studies,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol.31, pp.1645–1658, 2023. 2022 JCR IF:4.9, Rank:4/68=5.1% in Rehabilitation. [pdf] [link]

  • Byung Hyung Kim, Sungho Jo, Sunghee Choi, “ALIS: Learning Affective Causality behind Daily Activities from a Wearable Life-Log System,” IEEE Transactions on Cybernetics, vol.52, no.12, pp.13212–13224, 2022. IF:19.118, JCR Rank:3/146=1.72% in Computer Science, Artificial Intelligence. [pdf] [link]

  • Byung Hyung Kim, Ji Ho Kwak, Minuk Kim, Sungho Jo, “Affect-driven Robot Behavior Learning System using EEG Signals for Less Negative Feelings and More Positive Outcomes,” IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), pp. 4162-4167, Sep, 2021. [pdf] [link]

  • Yoon-Je Suh, Byung Hyung Kim, “Riemannian Embedding Banks for Common Spatial Patterns with EEG-based SPD Neural Networks,” 35th AAAI Conference on Artificial Intelligence (AAAI), vol.35, no.1, pp.854–862, Feb, 2021. Acceptance Rate=21.4%, Top-tier in Computer Science. Co-first Author. Corresponding Author. [pdf] [link]

  • Byung Hyung Kim, Yoon-Je Suh, Honggu Lee, Sungho Jo, “Nonlinear Ranking Loss on Riemannian Potato Embedding,” 25th International Conference on Pattern Recognition (ICPR), pp.4348-4355, Jan, 2021. [pdf] [link]

  • Byung Hyung Kim, Seunghun Koh, Sejoon Huh, Sungho Jo, Sunghee Choi, “Improved Explanatory Efficacy on Human Affect and Workload through Interactive Process in Artificial Intelligence,” IEEE Access, vol.8, pp.189013-189024, 2020. [pdf] [link]

  • Byung Hyung Kim, Sungho Jo, Sunghee Choi, “A-Situ: a computational framework for affective labeling from psychological behaviors in real-life situations,” Scientific Reports, vol.10, 15916, Sep, 2020. [pdf]

  • Jin Woo Choi, Byung Hyung Kim, Sejoon Huh, Sungho Jo, “Observing Actions through Immersive Virtual Reality Enhances Motor Imagery Training,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol.28, no.7, pp.1614-1622, 2020. 2019 JCR IF:3.340, Rank:7/68=9.56% in Rehabilitation. Co-first Author. [pdf]

  • Byung Hyung Kim, Sungho Jo, “Deep Physiological Affect Network for the Recognition of Human Emotions,” IEEE Transactions on Affective Computing, vol.11, no.2, pp.230-243, 2020. 2019 JCR IF:7.512, Rank:11/136=7.72% in Computer Science, Artificial Intelligence. [pdf]

  • Seunghun Koh, Hee Ju Wi, Byung Hyung Kim, Sungho Jo, “Personalizing the Prediction: Interactive and Interpretable Machine Learning,” 16th IEEE International Conference on Ubiquitous Robots (UR), pp.354-359, Jun, 2019.

  • Byung Hyung Kim, Sungho Jo, “An Empirical Study on Effect of Physiological Asymmetry for Affective Stimuli in Daily Life,” 5th IEEE International Winter Workshop on Brain-Computer Interface, pp.103–105, Jan, 2017.

  • Byung Hyung Kim, Jinsung Chun, Sungho Jo, “Dynamic Motion Artifact Removal using Inertial Sensors for Mobile BCI,” 7th IEEE International EMBS Conference on Neural Engineering, pp.37–40, Apr, 2015.

  • Byung Hyung Kim, Sungho Jo, “Real-time Motion Artifact Detection and Removal for Ambulatory BCI,” 3rd IEEE International Winter Workshop on Brain-Computer Interface, pp.70–73, Jan, 2015.

  • Minho Kim, Byung Hyung Kim, Sungho Jo, “Quantitative Evaluation of a Low-cost Noninvasive Hybrid Interface based on EEG and Eye Movement,” IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol.23, no.2, pp.159-168, 2015. 2014 JCR IF:3.972, Rank:3/65=4.61% in Rehabilitation.

  • Byung Hyung Kim, Minho Kim, Sungho Jo, “Quadcopter flight control using a low-cost hybrid interface with EEG-based classification and eye tracking,” Computers in Biology and Medicine, vol.51, pp.82-92, 2014. Honorable Mention Paper(Top 10%).

  • Mingyang Li, Byung Hyung Kim, Anastasios Mourikis, “Real-time Motion Tracking on a Cellphone using Inertial Sensing and a Rolling-Shutter Camera,” IEEE International Conference on Robotics and Automation (ICRA), pp.4712-4719, May, 2013.

  • Byung Hyung Kim, Hak Chul Shin, Phill Kyu Rhee, “Hierarchical Spatiotemporal Modeling for Dynamic Video Trajectory Analysis,” Optical Engineering, vol.50, no.107206, Oct, 2011.

  • Byung Hyung Kim, Danna Gurari, Hough O’Donnell, Margrit Betke, “Interactive Art System for Multiple Users Based on Tracking Hand Movements,” IADIS International Conference Interfaces and Human Computer Interaction (IHCI), Jul, 2011.


Research

Our research broadly addresses the complex interplay of action, cognition, and emotion in the human brain and the factors that influence them. Our overarching goal is to tackle the critical challenges of building interactive, intelligent artificial intelligence (AI) systems that discover latent relationships among these connected components. Our studies combine behavioral measures with eye-tracking, computational modeling, virtual reality, measures of brain activity, and neuropsychological methods. These diverse interests and approaches lead to natural collaborations.

Our work has been published in top-tier AI conferences and journals such as AAAI, IEEE Transactions on Affective Computing, and IEEE Transactions on Cybernetics.

Specific themes of our interest include:

  • Building Predictive Models of Emotion with Non-linear Data in the Human Brain
  • Developing Affect-driven Closed-Loop AI Systems
  • Learning Affective Causality behind Daily Activities
  • Controlling Machine Systems by Human Mind in Natural Environments
  • Increasing Explainability in AI Systems and Its Effects on Mental Models and Reasoning

Related keywords include Affective Computing, Brain-Computer Interface (BCI), Deep Learning, Riemannian Geometry, Human-Machine (Robot) Interaction, Machine Learning, and Manifold Learning.

Building predictive models of emotion with non-linear data in the human brain

The ability to predict emotional changes is a fundamental measure of affective intelligence, since it enables AI systems to characterize the neuropsychological activities underlying states of feeling. Our group aims to present reliable solutions for learning from non-linear brain data, overcoming the challenges induced by its non-stationary nature. We pursue transdisciplinary approaches that go beyond purely data-driven methods: motivations, ideas, and theoretical frameworks from psychiatry, behavioral science, and geometry underlie much of our predictive modeling. Our scope includes, but is not limited to, recognizing human affect, analyzing spatio-temporal hemispheric structures across neuropsychological activities, and classifying physiological data such as electroencephalography (EEG), photoplethysmography (PPG), electromyography (EMG), and facial expression images.
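As a minimal illustration of the geometric side of this work (a sketch only, not our published pipeline), EEG spatial covariance matrices are symmetric positive-definite (SPD), so they live on a curved manifold rather than a vector space, and can be compared with the affine-invariant Riemannian metric:

```python
import numpy as np

def spd_logm(M):
    """Matrix logarithm of a symmetric positive-definite (SPD) matrix."""
    w, V = np.linalg.eigh(M)
    return V @ np.diag(np.log(w)) @ V.T

def airm_distance(A, B):
    """Affine-invariant Riemannian distance between SPD matrices:
    d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F."""
    w, V = np.linalg.eigh(A)
    A_inv_sqrt = V @ np.diag(w ** -0.5) @ V.T
    return np.linalg.norm(spd_logm(A_inv_sqrt @ B @ A_inv_sqrt), "fro")

# Toy "EEG" segments: 4 channels x 256 samples; their covariances are SPD.
rng = np.random.default_rng(0)
C1 = np.cov(rng.standard_normal((4, 256)))
C2 = np.cov(rng.standard_normal((4, 256)))
print(airm_distance(C1, C2))  # positive distance between distinct segments
print(airm_distance(C1, C1))  # numerically zero
```

Classifiers built on such distances respect the manifold structure of covariance features, which is the intuition behind SPD networks like those in our publications.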

Developing Affect-driven Closed-Loop AI Systems

Physiological responses are widely used as human feedback for developing closed-loop systems, increasing the capacity for human-AI communication and enabling practical collaboration. Evoked responses have long served as a feedback mechanism for confirming the correctness of machine responses, but this approach requires the end-user to remain attentive while interacting with the AI system. Moreover, the amount of attention needed for decision-making grows with task difficulty, so the quality of human feedback degrades over time due to fatigue. To overcome this limitation, our group investigates the affective processes of a symbiotic human-AI relationship. Our hypothesis is that a successful closed-loop system should enable users to develop appropriate trust toward the AI system, thereby increasing their understanding of machine behavior and reducing negative feelings toward it. In turn, the AI system reflects affective feedback by changing how it decides its next action so as to produce positive outcomes. Our study therefore aims to develop a closed-loop system that learns emotional reactions to machine behaviors and uses affective feedback to optimize its parameters for smooth actions. We further consider how a user's emotional feedback can shape the affective processes in the brain associated with machine behaviors.

Learning Affective Causality behind Daily Activities

Human emotions and behaviors are reciprocal components that shape each other in everyday life. While past research has studied each element with various physiological sensors, their interactive relationship in the context of daily life remains largely unexplored. Our research aims to build interactive AI systems powered by large-scale user data. With an unprecedented number of users interacting with wearable technology, the system analyzes how the contexts of a user's life affect their emotional changes and builds causal structures linking emotions to observable behaviors in daily situations. We have further demonstrated that these causal structures can identify individual sources of mental relief suited to negative situations in real life.

Controlling Machine Systems by Human Mind in Natural Environments

Brain-computer interface (BCI) technologies have translated neural information into commands capable of controlling machine systems such as robot arms and drones. Can our minds connect with such AI systems easily in daily life by wearing low-cost devices? To answer this question, our research aims to develop hybrid interfaces combining EEG-based classification with eye tracking, and to investigate their feasibility through a Fitts' law-based quantitative evaluation method.
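For concreteness, a Fitts'-law-style evaluation scores pointing tasks by an index of difficulty and predicts movement time from it. The sketch below uses the standard Shannon formulation; the coefficients `a` and `b` are illustrative placeholders (in practice they are fitted per user and device from pointing trials), not values from our studies.

```python
import math

def index_of_difficulty(distance, width):
    """Shannon formulation of Fitts' index of difficulty, in bits:
    ID = log2(D / W + 1), for target distance D and target width W."""
    return math.log2(distance / width + 1)

def movement_time(distance, width, a=0.2, b=0.1):
    """Fitts' law: MT = a + b * ID. Here a (seconds) and b (seconds/bit)
    are hypothetical placeholders to be fitted from real trials."""
    return a + b * index_of_difficulty(distance, width)

# A target 300 px away and 100 px wide has ID = log2(4) = 2 bits.
print(index_of_difficulty(300, 100))  # 2.0
```

Comparing fitted `b` values (or throughput, ID divided by measured movement time) across interfaces gives a quantitative basis for judging whether a hybrid EEG/eye-tracking interface is practical.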

Increasing Explainability in AI Systems and Its Effects on Mental Models and Reasoning

AI systems have achieved high predictive performance and increasingly offer explanatory features to support their decisions, raising algorithmic transparency and accountability in real-world environments. However, high predictive accuracy alone is insufficient: ultimately, AI must also solve the human-agent interaction problem. Our hypothesis is that explanations that are succinct and easily interpretable should enable users to develop an efficient mental model of the AI. In turn, that mental model should enable them to develop appropriate trust in the AI and to perform well when using it. The main goal of this research is to build human-interpretable machine learning systems and to evaluate their explanatory efficacy, along with its effects on users' mental models and reasoning.


Members

Welcome to the Affective Artificial Intelligence Lab. (affctiv.ai)

Dr. Byung Hyung Kim leads the Affective Artificial Intelligence Lab. He is currently an Assistant Professor in the Department of Artificial Intelligence at Inha University. He received his Ph.D. in Computer Science from KAIST under the supervision of Prof. Sungho Jo, and completed his master's degree in Computer Science at Boston University, working with Prof. Margrit Betke and Prof. Stan Sclaroff.

His research interests include algorithmic transparency, interpretability in affective intelligence, computational emotional dynamics, cerebral asymmetry and the effects of emotion on brain structure for affective computing, brain-computer interface, and assistive and rehabilitative technology.

He occasionally reviews for the following journals:

  • IEEE Trans. Pattern Analysis and Machine Intelligence, IEEE Trans. on Affective Computing, IEEE Trans. on Cybernetics, IEEE Trans. on Neural Networks and Learning Systems, IEEE Trans. on Multimedia, Computers in Biology and Medicine

His CV is available here.


Kihyeon Joo, Lab Representative

  • Master’s Program, 2024.03 - Present.
  • Artificial Intelligence, Inha University
  • B.S. Information and Communication Engineering, Inha University
  • kihyeonjoo at inha.edu

Jiyoung Yu, Lab Administrator

  • Administrative Staff, 2024.03 - Present.
  • Budget, Research Agreement
  • #908, HighTech Bldg., Inha University
  • yuji0 at inha.ac.kr

Hanyu Li

  • Ph.D. Program, 2022.09 - Present.
  • Artificial Intelligence, Inha University
  • M.S. Computer Science and Technology, CQUPT, China
  • lihanyu at inha.edu

In-Kyung Lee

  • Ph.D. Program, 2024.03 - Present.
  • Artificial Intelligence, Inha University
  • M.S. Industrial Engineering, Inha University
  • 9ruddls3 at inha.edu

Zhengmao Hua

  • Ph.D. Program, 2025.03 - Present.
  • Artificial Intelligence, Inha University
  • M.S. Electrical Engineering, Hanyang University
  • zaynhua at inha.edu

Isaac Yoon Seock Bang

  • M.S./Ph.D. Integrated Program, 2022.09 - Present.
  • Artificial Intelligence, Inha University
  • B.S. Industrial Engineering, Inha University
  • isaacrulz93 at inha.edu

ChaeEun Woo

  • M.S./Ph.D. Integrated Program, 2023.03 - Present.
  • Artificial Intelligence, Inha University
  • B.S. Electrical Engineering, Gangneung-Wonju National University
  • codms1440 at inha.edu

Huisu Lim

  • M.S./Ph.D. Integrated Program, 2025.03 - Present.
  • Artificial Intelligence, Inha University
  • B.S. Computer Engineering, Inha University
  • hslim4922 at inha.edu

Sanghyun Lee

  • Master’s Program, 2022.03 - Present.
  • Artificial Intelligence, Inha University
  • B.S. Biomedical Engineering, Korea University
  • dsbjegi at inha.edu

Sejin Park

  • Master’s Program, 2024.03 - Present.
  • Artificial Intelligence, Inha University
  • B.S. Computer Science and Engineering, Sunmoon University
  • sejinpark at inha.edu

Sumin Lee

  • Master’s Program, 2024.03 - Present.
  • Artificial Intelligence, Inha University
  • B.S. Computer Science and Engineering, Kyonggi University
  • leejae7124 at inha.edu

Soomin Park

  • Master’s Program, 2024.09 - Present.
  • Artificial Intelligence, Inha University
  • B.S. Information and Communication Engineering, Inha University
  • minsoominsoo at inha.edu

TaekGyun Kim

  • Master’s Program, 2024.09 - Present.
  • Artificial Intelligence, Inha University
  • B.S. Computer Engineering, Inha University
  • taekgyun.kim at inha.edu

Tae Ho Yoo

  • Master’s Program, 2025.03 - Present.
  • Artificial Intelligence, Inha University
  • B.A. Computer Science and Cognitive Science, Franklin and Marshall College, US
  • tyoo at inha.edu

Mi Hwa Kim

  • Master’s Program, 2025.03 - Present.
  • Artificial Intelligence Convergence, Inha University
  • Ph.D. Education, Incheon University
  • mihwakim at inha.ac.kr

Undergraduate Interns: 김가은, 김남형, 김승연, 소윤희, 박지우, 백재윤, 육지훈, 이제동, 이채은, 전인서, 정희재, 최우진, 홍성준


Alumni: HyoSeon Choi (M.S., 2024.08), Hyunwook Kang (M.S., 2025.02), Dahoon Choi (B.S., 2024.02) @Kakao, SeungHan Kim (B.S., 2025.02) @Korea Univ., Taegyun Jin (B.S., 2025.02) @POSTECH


Philosophy

“The bird fights its way out of the egg. The egg is the world. Who would be born must first destroy a world.” – Hermann Hesse, Demian

The Affective Artificial Intelligence Lab. (affctiv.ai) exists to break free from the shell of conventional science. We believe that the “egg” is not just science itself—it also represents the world we seek to change. To crack this shell, we immerse ourselves in creative projects, pursue uncompromisingly high-quality research, and tackle unique problems with methodological rigor.

We believe that noble science is only possible when grounded in shared values. Through the journey of breaking this scientific shell, we not only grow as researchers but also prepare ourselves to confront the broader challenges of the world beyond the lab.

Our lab motto, “POKEMON,” captures the values we strive to embody. While we acknowledge these are aspirational ideals, we are committed to growing into them, together.

  • Pride. We take pride in both what we achieve and how we achieve it.
  • Objectivity. We uphold scientific rigor, clarity, and reproducibility. No shortcuts.
  • Knowledge. We continuously expand our knowledge and skills, pushing beyond our current boundaries.
  • Equality. We treat everyone with equal respect and value, regardless of background.
  • Mentorship. We embrace mentorship as a mutual responsibility, offering and receiving guidance with humility.
  • Openness. We are open to all people, new ideas, and shifts in perspective.
  • Network. We foster strong connections within our team, with collaborators, and with the broader community.

Though we are not perfect, we are committed to living up to these ideals—to becoming our best version of an ideal “Pokemon.”

If you agree with our philosophy and are interested in what we’ve achieved, please read more about our open positions. Our lab welcomes applicants from all backgrounds, regardless of race, ethnicity, religion, national origin, age, or disability status. We are committed to cultivating a collaborative, inclusive, and supportive lab environment, where every member can thrive.


[Github]

Affective Artificial Intelligence Laboratory

Department of Artificial Intelligence, Inha University.

d-asnaghi design