
IFE Experiential Classroom - Learning Lab

The Experiential Classroom - Learning Lab is a multimodal learning analytics laboratory operated by the IFE Living Lab, located at the Expedition-FEMSA building within Monterrey's Innovation District (DistritoTec).

It is a 165 m² flexible space equipped with cutting-edge technologies and sensors, focused on projects that study interactions and experiences in dynamics that emulate real learning and collaboration contexts.

This high-tech laboratory is an exciting and disruptive environment for experimental research, promoting synergies with companies, research centers, and educational institutions.

IFE Experiential Classroom
IFE Experiential Classroom Layout

The Experiential Classroom - Learning Lab offers access to more than 15 technologies and 40 peripheral sensors to collect, process, and analyze data in multiple modalities, yielding deeper insight into factors such as motivation, cognition, communication, and collaboration.

Examples of analyzed modalities:

  • Skeletal (postures, gestures)
  • Ocular (attention, behavior)
  • Tactile (sketching, writing)
  • Auditory (conversational patterns)
  • Physiological (alertness, stress)
  • Neuronal (concentration, mental fatigue)
  • Spatial (cohesion, movement)
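To make the modalities above concrete, here is a minimal sketch of how a synchronized stream of sensor readings could be represented and bucketed per modality for analysis. The class and field names are illustrative assumptions, not the lab's actual data schema:

```python
from dataclasses import dataclass
from typing import Dict, List

# Hypothetical record for one timestamped multimodal sample;
# field names are illustrative, not the lab's real schema.
@dataclass
class MultimodalSample:
    timestamp_s: float        # seconds since session start
    participant_id: str
    modality: str             # e.g. "skeletal", "ocular", "physiological"
    values: Dict[str, float]  # raw sensor readings for this sample

def group_by_modality(samples: List[MultimodalSample]) -> Dict[str, List[MultimodalSample]]:
    """Bucket a mixed stream of samples by modality for per-channel analysis."""
    buckets: Dict[str, List[MultimodalSample]] = {}
    for s in samples:
        buckets.setdefault(s.modality, []).append(s)
    return buckets
```

A grouping step like this is typically the first stage before modality-specific processing (e.g. gaze analysis on ocular samples, posture estimation on skeletal ones).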
Experiential Classroom Student Setup

Current Setup

Our current setup can host activities with up to 25 participants; we invite key IFE Living Lab stakeholders to conduct studies on the features and factors present in different learning and collaboration scenarios.

Turn Your Ideas into Evidence

Are you ready to explore how people interact with technology through cutting-edge multimodal research?

👉 Contact us to bring your research or innovation project to life.

Our current setup includes:

  • 2 VR stations with fixed screens
  • 2 short-throw projectors
  • 1 85” mobile touchscreen computer
  • 1 real-time positioning system
  • 2 high-performance desktop computers
  • 5 high-performance laptops
  • 20 tablets
  • 1 matrix of 5 cameras, 4 speakers, and 2 environmental microphones
  • 25 chairs
  • 12 tables
  • More than 40 peripheral devices and sensors

Current Experiments

S4L-ET

We are currently developing an experiment focused on the analysis of attention and focus through eye tracking in a human-computer interaction scenario.

This study involves students interacting with a logistics simulator while we collect precise data on where and how participants focus their gaze in real time. This information is essential for understanding attention patterns and how they correlate with performance on specific tasks within the simulator, with the goal of improving user-computer interfaces for more effective learning and interaction.
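A common first metric in eye-tracking analyses of this kind is the fraction of gaze samples that land inside an area of interest (AOI), such as a panel of the simulator. The following is a minimal sketch under that assumption, not the study's actual analysis code:

```python
from typing import List, Tuple

def aoi_dwell_fraction(gaze_points: List[Tuple[float, float]],
                       aoi: Tuple[float, float, float, float]) -> float:
    """Fraction of (x, y) gaze samples falling inside a rectangular AOI.

    aoi is (x_min, y_min, x_max, y_max) in the same coordinate
    system as the gaze points (e.g. normalized screen coordinates).
    """
    x0, y0, x1, y1 = aoi
    if not gaze_points:
        return 0.0
    inside = sum(1 for (x, y) in gaze_points if x0 <= x <= x1 and y0 <= y <= y1)
    return inside / len(gaze_points)
```

Comparing dwell fractions across AOIs (and against task performance) is one simple way to quantify where attention concentrates during a task.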


Previous Experiments

GTL-Mars

In 2023, we conducted a collaborative experiment focused on capturing multiple modalities for assessing remote teamwork (GTL-Mars).

The experiment involved teams of students solving an optimization problem in a hypothetical Mars colony by operating a computer simulator over Zoom. We collected raw audio and video data for the recognition of conversational patterns, emotions, and problem-solving performance.
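Conversational-pattern analysis typically starts from diarized speech segments (speaker label plus start and end times). As a minimal illustrative sketch, not the pipeline used in the study, per-speaker talk time can be aggregated like this:

```python
from collections import defaultdict
from typing import Dict, List, Tuple

def talk_time_per_speaker(segments: List[Tuple[str, float, float]]) -> Dict[str, float]:
    """Total speaking time per speaker from diarized (speaker, start_s, end_s) segments."""
    totals: Dict[str, float] = defaultdict(float)
    for speaker, start, end in segments:
        totals[speaker] += end - start
    return dict(totals)
```

Simple aggregates like this (talk-time balance, turn counts, overlap) are common proxies for participation equality in remote-teamwork research.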

This experiment was developed in collaboration with the Global Teamwork Lab, involving MIT, THUAS, and RUG.

GTL-Mars Experiment

NPFC-Test

In 2023, we conducted a Neuronal, Physiological, and Facial Coding experiment (NPFC-Test) aimed at capturing multiple modalities for concentration and engagement assessment.

The NPFC-Test involved human-computer interaction within a 20-minute test that included different activities with audiovisual stimuli, concentration tasks, and self-reports. We collected raw data from brainwave activity, digital biomarkers, and RGB video for facial gesture recognition, along with the self-reports.

We are currently documenting the experiment’s dataset, which will be offered to researchers through the IFE Data Hub.

NPFC-Test Experiment

Available Technologies

Empatica EmbracePlus

Empatica Embraceplus bracelet

Device official website

  • EmbracePlus: Specifications

Azure Kinect

Azure Kinect depth camera

Device official website

  • Azure Kinect: Specifications

Muse 2

Muse 2 headband

Device official website

  • Muse 2: Specifications

Tobii Pro Spark

Tobii Pro Spark Eye Tracker

Device official website

Meta Quest 3

Meta Quest 3 headset

Device official website

Leap Motion Controller

Ultraleap Leap Motion Controller

Device official website

HTC VIVE Pro Eye

HTC VIVE Pro Eye Headset

Device official website

Focusrite Scarlett 8i6

Focusrite Scarlett 8i6 Audio Console

Device official website

Dell Alienware Aurora R13

Dell Alienware Aurora R13 desktop PC

Device official website

Dell Alienware M15 R7

Dell Alienware M15 R7

Device official website

Samsung Freestyle

Samsung Freestyle Projector

Device official website

LG StanbyME

LG StanbyME Rollable Touch Screen

Device official website

OBSBOT Tiny 4K

OBSBOT Tiny PTZ 4K Webcam

Device official website

Living Lab & Data Hub | Institute for the Future of Education | 
Tecnológico de Monterrey | Av. Eugenio Garza Sada 2501 Sur Col. Tecnológico C.P. 64849 |
Monterrey, Nuevo Leon, Mexico.
