Comprehensive Summary
This study introduces Concept2Brain, an artificial intelligence system that predicts how brain activity unfolds when someone views an image or reads a short passage, expressed as EEG event-related potentials that index perception, attention, and emotion. The system first captures the meaning of the input with a vision-and-language encoder, then translates that concept signal into an EEG representation learned from human recordings via a conditioned variational autoencoder: the autoencoder organizes brain activity in a compact latent space, from which the model generates full multi-channel waveforms for any virtual participant that resemble real responses. The synthetic signals recover familiar patterns from the literature, including early sensory peaks, a classic attention-related positivity, and a late positive potential that grows with emotional content. The model also reproduces a face-selective response with the expected right-posterior scalp distribution, consistent with the N170 benchmark from face perception studies. It preserves realistic differences across individuals rather than collapsing them to a single average, and it accepts both images and short text prompts describing affective scenes. The team provides a cloud platform where users can upload images or enter text, select virtual participants, visualize scalp activity over time, and download the data, enabling open, reproducible in silico EEG datasets without running a full experiment for each new stimulus set.
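To make the generative step concrete, the sketch below shows one way a concept embedding from a vision-and-language encoder, together with a virtual-participant code, could condition a VAE-style decoder that emits a multi-channel ERP epoch. This is not the authors' implementation; all dimensions, layer sizes, and names are illustrative assumptions, written in PyTorch.

```python
# Minimal sketch (not the authors' code) of the conditioned generative step:
# a concept embedding plus a virtual-participant code condition a VAE-style
# decoder that maps a latent sample to a multi-channel ERP waveform.
import torch
import torch.nn as nn

N_CHANNELS = 64     # assumed EEG montage size
N_SAMPLES = 256     # assumed time points per epoch
CONCEPT_DIM = 512   # assumed size of the vision/language embedding
SUBJECT_DIM = 16    # assumed size of the virtual-participant code
LATENT_DIM = 32     # assumed size of the VAE latent space

class ConditionedEEGDecoder(nn.Module):
    """Decode a latent vector, conditioned on concept + participant, into EEG."""
    def __init__(self):
        super().__init__()
        in_dim = LATENT_DIM + CONCEPT_DIM + SUBJECT_DIM
        self.net = nn.Sequential(
            nn.Linear(in_dim, 1024),
            nn.ReLU(),
            nn.Linear(1024, N_CHANNELS * N_SAMPLES),
        )

    def forward(self, z, concept, subject):
        # Concatenate latent sample with the two conditioning vectors.
        x = torch.cat([z, concept, subject], dim=-1)
        out = self.net(x)
        # Reshape so each row is one electrode's time course.
        return out.view(-1, N_CHANNELS, N_SAMPLES)

decoder = ConditionedEEGDecoder()
concept = torch.randn(1, CONCEPT_DIM)  # stand-in for the encoder's output
subject = torch.randn(1, SUBJECT_DIM)  # stand-in for one virtual participant
z = torch.randn(1, LATENT_DIM)         # sample from the latent prior
synthetic_erp = decoder(z, concept, subject)
print(synthetic_erp.shape)             # torch.Size([1, 64, 256])
```

In the actual system the decoder is trained jointly with an encoder on human recordings, so the latent space reflects real ERP variability; this toy version only illustrates how the conditioning vectors enter the generator.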
Outcomes and Implications
Concept2Brain offers teams a quick way to preview how candidate stimuli will influence attention and emotion in EEG before recruiting participants. It helps refine protocols, focus electrode placement and analysis windows, and plan power analyses and simulations (a sketch of that use follows below). Because it replicates well-known markers such as the late positive potential and the face-selective N170 with realistic scalp maps, it provides useful reference patterns for assessing affective processing and social perception with images or short text. The web platform lowers barriers by generating large, flexible datasets across many virtual participants, letting smaller groups explore hypotheses and test analysis pipelines without immediate data collection. Used as an open companion to empirical studies, it links semantic content with measurable brain dynamics, speeding the development of reliable EEG markers while keeping studies efficient, reproducible, and accessible.
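As a hedged illustration of the power-planning use case mentioned above, the snippet below treats per-participant late positive potential differences (such as those that could be exported from the platform) as pilot estimates and runs a Monte Carlo simulation to gauge how many participants a paired comparison might need. The effect size, noise level, and sample sizes are made-up placeholders, not values from the study.

```python
# Simulation-based power estimate for an emotional-vs-neutral LPP contrast.
# All numbers are illustrative assumptions, not results from the paper.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
effect_uv = 1.5   # assumed emotional-minus-neutral LPP difference (microvolts)
sd_uv = 3.0       # assumed between-participant SD of that difference
alpha = 0.05
n_sims = 2000

for n in (10, 20, 30, 40):
    hits = 0
    for _ in range(n_sims):
        diffs = rng.normal(effect_uv, sd_uv, size=n)  # simulated participants
        _, p = stats.ttest_1samp(diffs, 0.0)          # paired test on differences
        hits += p < alpha
    print(f"n={n:2d}  estimated power={hits / n_sims:.2f}")
```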