
[ python memory error when loading MNIST.pkl.gz ]

I am new to Python and I downloaded the code DBN.py, but there is a problem: when I try to load the dataset mnist.pkl.gz, I always get a memory error. My code is very simple:

import cPickle, gzip, numpy
# Load the dataset (raw string avoids backslash-escape issues in the Windows path)
f = gzip.open(r'C:\Users\MAC\Desktop\mnist.pkl.gz', 'rb')
train_set, valid_set, test_set = cPickle.load(f)
f.close()
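For reference, the same load pattern can be sketched self-containedly. This uses the `pickle` module (the pure-Python counterpart of Python 2's `cPickle`), and the tiny stand-in dataset and temp-file path are hypothetical, so the snippet runs without the real ~50 MB mnist.pkl.gz:

```python
import gzip
import os
import pickle  # cPickle in Python 2
import tempfile

# Hypothetical small dataset standing in for mnist.pkl.gz.
sample = (['train'], ['valid'], ['test'])

path = os.path.join(tempfile.gettempdir(), 'sample.pkl.gz')
with gzip.open(path, 'wb') as f:
    pickle.dump(sample, f)

# The same load pattern as in the question; a with-block closes the
# file even if unpickling raises (e.g. MemoryError).
with gzip.open(path, 'rb') as f:
    train_set, valid_set, test_set = pickle.load(f)

print(train_set)  # ['train']
```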

and the error is as follows:

Traceback (most recent call last):

File "<ipython-input-17-528eea6bbfdd>", line 1, in <module>
runfile('C:/Users/MAC/Documents/Python Scripts/untitled0.py',  wdir='C:/Users/MAC/Documents/Python Scripts')

File "C:\Users\MAC\Anaconda\lib\site-packages\spyderlib\widgets\externalshell\sitecustomize.py", line 699, in runfile
execfile(filename, namespace)

File "C:\Users\MAC\Anaconda\lib\site-packages\spyderlib\widgets\externalshell\sitecustomize.py", line 74, in execfile
exec(compile(scripttext, filename, 'exec'), glob, loc)

File "C:/Users/MAC/Documents/Python Scripts/untitled0.py", line 19, in <module>
train_set, valid_set, test_set = cPickle.load(f)

File "C:\Users\MAC\Anaconda\lib\gzip.py", line 268, in read

File "C:\Users\MAC\Anaconda\lib\gzip.py", line 320, in _read
self._add_read_data( uncompress )

File "C:\Users\MAC\Anaconda\lib\gzip.py", line 338, in _add_read_data
self.extrabuf = self.extrabuf[offset:] + data
MemoryError


I really have no idea. Is it because my computer's memory is too small? It is on Windows 7, 32-bit.
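The 32-bit part matters: a 32-bit Python process on Windows can address at most about 2 GB regardless of installed RAM, and decompressing plus unpickling the dataset needs several hundred MB of contiguous memory. A quick sketch for checking which build of Python is actually running:

```python
import struct
import sys

# Pointer size in bits: 32 on a 32-bit interpreter, 64 on a 64-bit one.
bits = struct.calcsize('P') * 8
print('%d-bit Python' % bits)

# sys.maxsize is 2**31 - 1 on a 32-bit build, 2**63 - 1 on 64-bit.
print(sys.maxsize)
```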

Answer 1

I suspect the problem is Spyder in this case.
As to why, I have no idea, but either the process isn't allowed to allocate enough memory outside of its own script, or it simply gets stuck in a loop somehow.

Try running your code without Spyder: paste it into a file such as myscript.py, open a terminal, navigate to the folder where you saved the script, and run python myscript.py. See whether that works or gives the same error.
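The steps above can be sketched as a minimal check (POSIX shell shown for brevity; on Windows 7 the equivalent is opening cmd.exe, cd-ing into the script's folder, and running python myscript.py; the script content here is a stand-in):

```shell
# Work in a scratch directory and create a stand-in myscript.py.
cd "$(mktemp -d)"
printf 'print("ok outside Spyder")\n' > myscript.py

# Run it with the plain interpreter, no IDE in between.
python3 myscript.py
```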

This is based on a conversation in the comments above.