I’ve been running a MATLAB script on a cluster, and I got a warning that my MATLAB program was consuming in excess of 500% CPU on the head node, so somehow MATLAB is accessing the head node.
I’ve been submitting my jobs by running `sbatch myJob.batch`, where myJob.batch is the following script. I would appreciate any help.
```
#!/bin/bash
#SBATCH -J 9h_11h_6g_c_windowing_blur_overload
#SBATCH -o %x.o%j              # output file name
#SBATCH -e %x.e%j              # error file name
#SBATCH -p jila
#SBATCH -q standard
#SBATCH -n 8                   # number of cores
#SBATCH -N 1                   # number of nodes (use 1)
#SBATCH --mem=30G              # allocated memory
#SBATCH -t 07-00:00:00         # allocated wall time
#SBATCH --tmp=1GB              # allocated disk space
#SBATCH --mail-type=BEGIN,END  # notification type
#SBATCH [email protected]

# load matlab module
module load matlab

# run matlab file
matlab -nodisplay -nodesktop -r "PulseReconstruction; exit;"
```
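In case it matters: I believe newer MATLAB releases (R2019a and later) offer `-batch` as a replacement for `-r`, which runs non-interactively and exits on its own; a sketch of the launch line under that assumption:

```
# run matlab file non-interactively; -batch implies -nodisplay and
# exits with a nonzero status if PulseReconstruction errors
matlab -batch "PulseReconstruction"
```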
I suspect the problem is with using parpool in MATLAB, which might somehow be accessing the command line on the cluster.
This is how I'm requesting the parallel pool in MATLAB:
```
if isempty(gcp('nocreate'))
    parpool;
end
disp(gcp('nocreate'));
```
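For reference, the pool size here is not pinned to the Slurm allocation; it just takes the profile default. A sketch of what I could change it to, assuming the job environment exposes `SLURM_CPUS_ON_NODE` (set by Slurm inside the allocation):

```
% Size the pool from the Slurm allocation instead of the profile default
nCores = str2double(getenv('SLURM_CPUS_ON_NODE'));
if isempty(gcp('nocreate'))
    parpool('Processes', nCores);
end
```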
And this is my output from it:
```
Starting parallel pool (parpool) using the 'Processes' profile ...
Connected to parallel pool with 8 workers.

ProcessPool with properties:

            Connected: true
           NumWorkers: 8
                 Busy: false
              Cluster: Processes (Local Cluster)
        AttachedFiles: {}
    AutoAddClientPath: true
            FileStore: [1x1 parallel.FileStore]
           ValueStore: [1x1 parallel.ValueStore]
          IdleTimeout: 30 minutes (30 minutes remaining)
          SpmdEnabled: true
```
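The output shows 8 workers, so if the pool somehow started on the head node, that alone could account for well over 500% CPU. A sketch of how I could check which machine each worker actually runs on (just printing the hostname from inside the pool):

```
% Print the hostname each pool worker runs on, to check whether
% any worker landed on the head node instead of the compute node
spmd
    [~, host] = system('hostname');
    fprintf('worker running on %s', host);
end
```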