Running SPM First-Level Analyses from the Command Line

For most people, doing first-level analyses in SPM is tedious in the extreme; for other people, these analyses are manageable through command line scripting in Matlab. And there are still other people who don't even think about first-level analyses or SPM or Matlab or any of that. This last class of people are your college classmates who majored in something else, like economics, and are living significantly less stressful lives than you, making scads of money, and regularly eating out at restaurants where they serve shrimp the size of Mike Tyson's forearm.

Assuming that you belong to the first category, however, first-level analyses are invariably a crashing bore; and, worse, setting them up through the GUI is completely impractical for something like beta-series analysis, where a separate regressor is needed for each individual event.

The following script will do all of that for you, although it makes a couple of assumptions about your timing files and where everything is stored: first, that your directories are set up with a root directory containing all of the subjects, as well as a separate timing directory containing your onsets files; and second, that the timing files are written in a four-column format of run, regressor name, onset (in seconds), and duration (in seconds). Once you have all of that in place, running your analysis takes only a few simple modifications to the script.
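For reference, a timing file in that format might look something like the following (the condition names and times here are made up purely for illustration; note the single header line, which the script skips):

Run  Condition    Onset  Duration
1    Congruent    10.0   2.0
1    Incongruent  18.5   2.0
1    Congruent    27.0   2.0
2    Incongruent  12.5   2.0
2    Congruent    20.0   2.0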

As always, let me know if this actually helps, or whether there are sections of code that can be cleaned up. I will be making a modification tomorrow allowing for beta series analysis by converting all of the timing events into individual regressors, which should help things out.
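If you want a head start on that, the basic idea is simply to give every event its own condition name, so that SPM builds one regressor (and therefore one beta) per event. A rough, hypothetical sketch of what the inner per-run loop might look like, using the same variable names as the script below (this is not part of the posted script):

%Hypothetical sketch of a beta-series version of the per-run loop:
%give every event its own uniquely named regressor
eventIdx = find(T{1} == runs(runIdx)); %All events belonging to the current run
names = cell(1, length(eventIdx));
onsets = cell(1, length(eventIdx));
durations = cell(1, length(eventIdx));
for i = 1:length(eventIdx)
    names{i} = sprintf('%s_%03d', T{2}{eventIdx(i)}, i); %e.g., 'Congruent_001'
    onsets{i} = T{3}(eventIdx(i)) - (TR*disacqs); %Adjust for discarded acquisitions
    durations{i} = T{4}(eventIdx(i));
end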

Link to script: http://mypage.iu.edu/~ajahn/docs/Run_SPM_FirstLevel.m

% INPUT:
%  rootDir        - Path to where subjects are located
%  subjects       - Vector containing list of subjects (can concatenate with brackets)
%  spmDir         - Path to the folder where the multiple conditions .mat files
%                   and the output SPM files will be written, relative to each
%                   subject's directory (created if it doesn't exist)
%  timingDir      - Path to timing directory, relative to rootDir
%  timingSuffix   - Suffix appended to timing files
%  matPrefix      - Will be prefixed to output .mat files.
%  dataDir        - Directory storing functional runs.
%
% OUTPUT:
%  One multiple condition .mat file per run. Also specifies the GLM and runs
%  beta estimation.
%       
%
%       Note that this script assumes that your timing files are organized
%       with columns in the following order: Run, ConditionName, Onset (in
%       seconds), Duration (in seconds)
%
%       Ideally, this timing file should be written out from your
%       presentation software, e.g., E-Prime
%
%       Also note how the directory tree is structured in the following example. You may have to
%       change this script to reflect your directory structure.
%       
%       Assume that:
%         rootDir = '/server/studyDirectory/'
%         spmDir = '/modelOutput/'
%         timingDir = '/timings/onsets/'
%         subjects = [101]
%         timingSuffix = '_timing.txt'
%       1) Sample path to SPM directory:
%       '/server/studyDirectory/101/modelOutput/'
%       2) Sample path to timing file in the timing directory:
%       '/server/studyDirectory/timings/onsets/101_timing.txt'
%
% Andrew Jahn, Indiana University, June 2014
% andysbrainblog.blogspot.com

clear all

%%%----Things you will want to change for your study----%%%

rootDir = '/data/hammer/space5/ImagineMove/fmri/';
subjects = [214 215];
spmDir = '/RESULTS/testMultis/';
timingDir = 'fMRI_behav/matlab_input/';
timingSuffix = '_spm_mixed.txt';
matPrefix = 'testMulti';
dataDir = '/RESULTS/data/';

%Change these parameters to reflect study-specific information about number
%of scans and discarded acquisitions (disacqs) per run
funcs = {'swuadf0006.nii', 'swuadf0007.nii', 'swuadf0008.nii'}; %You can add more functional runs; just remember to separate them with a comma
numScans = [240 240 240]; %This is the number of scans that you acquire per run
disacqs = 2; %This is the number of scans you later discard during preprocessing
numScans = numScans-disacqs;
TR = 2; %Repetition time, in seconds

%Estimating the GLM can take some time, particularly if you have a lot of betas. If you just want to specify your
%design matrix so that you can assess it for singularities, set this to 0.
%If you wish to do the estimation later, launching it through the GUI is
%very quick.
ESTIMATE_GLM = 1;


%%%-----------------------------------------------------%%%



%For each subject, create timing files and jobs structure
for subject = subjects
    
    %See whether output directory exists; if it doesn't, create it
    outputDir = [rootDir num2str(subject) spmDir];
    
    if ~exist(outputDir, 'dir')
        mkdir(outputDir)
    end
     
    %---Navigate to timing directory and create .mat files for each run---%
    cd([rootDir timingDir]);
    
    fid = fopen([num2str(subject) timingSuffix], 'rt');
    T = textscan(fid, '%f %s %f %f', 'HeaderLines', 1); %Columns should be 1)Run, 2)Regressor Name, 3) Onset Time (in seconds, relative to start of each run), and 4)Duration, in seconds
    fclose(fid);
    
    clear runs nameList names onsets durations sizeOnsets %Remove these variables if left over from previous analysis
    
    runs = unique(T{1});

    %Begin creating jobs structure
    jobs{1}.stats{1}.fmri_spec.dir = cellstr(outputDir);
    jobs{1}.stats{1}.fmri_spec.timing.units = 'secs';
    jobs{1}.stats{1}.fmri_spec.timing.RT = TR;
    jobs{1}.stats{1}.fmri_spec.timing.fmri_t = 16;
    jobs{1}.stats{1}.fmri_spec.timing.fmri_t0 = 1;

    %Create multiple conditions .mat file for each run
    for runIdx = 1:size(runs, 1)
        nameList = unique(T{2});
        names = nameList';
        onsets = cell(1, size(nameList,1));
        durations = cell(1, size(nameList,1));
        sizeOnsets = size(T{3}, 1);
        for nameIdx = 1:size(nameList,1)
            for idx = 1:sizeOnsets
                if isequal(T{2}{idx}, nameList{nameIdx}) && T{1}(idx) == runs(runIdx) %Match both condition name and run number
                    onsets{nameIdx} = double([onsets{nameIdx} T{3}(idx)]);
                    durations{nameIdx} = double([durations{nameIdx} T{4}(idx)]);
                end
            end
            onsets{nameIdx} = (onsets{nameIdx} - (TR*disacqs)); %Adjust timing for discarded acquisitions
        end

        save ([matPrefix '_' num2str(subject) '_' num2str(runIdx)], 'names', 'onsets', 'durations')
        
        %Grab frames for each run using spm_select, and fill in session
        %information within jobs structure
        files = spm_select('ExtFPList', [rootDir num2str(subject) dataDir], ['^' funcs{runIdx}], 1:numScans);

        jobs{1}.stats{1}.fmri_spec.sess(runIdx).scans = cellstr(files);
        jobs{1}.stats{1}.fmri_spec.sess(runIdx).cond = struct('name', {}, 'onset', {}, 'duration', {}, 'tmod', {}, 'pmod', {});
        jobs{1}.stats{1}.fmri_spec.sess(runIdx).multi = cellstr([outputDir matPrefix '_' num2str(subject) '_' num2str(runIdx) '.mat']);
        jobs{1}.stats{1}.fmri_spec.sess(runIdx).regress = struct('name', {}, 'val', {});
        jobs{1}.stats{1}.fmri_spec.sess(runIdx).multi_reg = {''};
        jobs{1}.stats{1}.fmri_spec.sess(runIdx).hpf = 128;
        
                
    end
    
    movefile([matPrefix '_' num2str(subject) '_*.mat'], outputDir)
    
    %Fill in the rest of the jobs fields
    jobs{1}.stats{1}.fmri_spec.fact = struct('name', {}, 'levels', {});
    jobs{1}.stats{1}.fmri_spec.bases.hrf = struct('derivs', [0 0]);
    jobs{1}.stats{1}.fmri_spec.volt = 1;
    jobs{1}.stats{1}.fmri_spec.global = 'None';
    jobs{1}.stats{1}.fmri_spec.mask = {''};
    jobs{1}.stats{1}.fmri_spec.cvi = 'AR(1)';
    
    %Navigate to output directory, specify and estimate GLM
    cd(outputDir);
    spm_jobman('run', jobs)
    
    if ESTIMATE_GLM == 1
        load SPM;
        spm_spm(SPM);
    end
    
end

%Uncomment the following line if you want to debug
%keyboard
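
To use it, save the script as Run_SPM_FirstLevel.m somewhere on your Matlab path (with SPM also on the path), edit the variables in the "Things you will want to change" block at the top, and then call it from the Matlab prompt:

>> Run_SPM_FirstLevel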



SPM Jobman


Now that we have created our own .mat files from the SPM GUI and seen how they can be written to disk, altered, and reloaded back into SPM, the hour is at hand for using the command spm_jobman. This is a command for those eager to disenthrall themselves from the tyranny of graphical interfaces by batching SPM processes from the command line.

I first met spm_jobman - also known as Tim - a few weeks ago at a conference, when I was at the nadir of my sorrows, despairing over whether I would ever be able to run SPM commands without the GUI. Suddenly, like a judge divinely sent in answer to the lamentations of the oppressed, spm_jobman appeared by my side, trig and smartly dressed, and said he would be more than happy to help out; and from my first impression of his bearing and demeanor, I believed I was in the presence of an able and reliable ally. Anyone who has ever met spm_jobman, I believe, has felt the same thing. However, as I learned too late, far from being a delight, he is a charmless psychopath; and he continues to infect my dreams with nameless horrors and the unrelenting screams of the abattoir.

spm_jobman has three main options to choose from: Interactive, Serial, and Run. After choosing one of these options, for the second argument you enter your jobs structure, which is populated automatically when you load the .mat file from the command line. Interactive will load the traditional GUI with the options filled in from the jobs structure, which you can then modify and execute as you please; Serial will prompt you to fill in each field, with the defaults set to the values in the jobs structure; and Run will execute the jobs structure without bringing up the GUI at all. For most purposes, if you decide to run spm_jobman at all, you will want the Run option, as it lets you loop processes over subjects without pause, freeing you up for more useful tasks, such as Googling the history of the lint roller.
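In code, and assuming the saved .mat file contains a variable called jobs (as in the script above; batches saved from the newer SPM Batch Editor call it matlabbatch instead), the three modes look like this; the filename is just a placeholder, and you would pick only one of the three calls:

load('my_saved_job.mat'); %Hypothetical filename; loading it puts the saved jobs structure into the workspace
spm_jobman('interactive', jobs); %Brings up the GUI with every field pre-filled, for you to inspect or tweak
spm_jobman('serial', jobs); %Prompts you field by field, using the jobs values as the defaults
spm_jobman('run', jobs); %Executes the job outright, with no GUI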

Saving .mat files from SPM is immensely helpful for understanding what exactly goes into them, which will in turn reinforce your understanding of, and your ability to manipulate, Matlab structures. The following tutorials show how the .mat file is generated from the SPM interface, which can then be used as a template for spm_jobman. I've been working with SPM for years now, but found out about this only recently; I hope it helps ease the burden of your SPM endeavors.