Jupyter Notebook (only) Memory Error, same code run in a conventional .py and works
Disclaimer: this page is a translation of a popular StackOverflow question, provided under the CC BY-SA 4.0 license. If you use or share it, you must do so under the same license and attribute it to the original authors (not me): StackOverflow
Original URL: http://stackoverflow.com/questions/43866413/
Asked by Danfoa
I have an assignment for a Deep Learning class, and they provide a Jupyter notebook as base code. The thing is that after running the data import and reshape, the Jupyter notebook throws a "Memory Error". After some analysis I tried running the same code in a normal .py file, and everything runs well.
The thing is that I'm required (preferably) to use the Jupyter notebook as the base for development, since it is more interactive for this kind of task.
<ipython-input-2-846f80a40ce2> in <module>()
2 # Load the raw CIFAR-10 data
3 cifar10_dir = 'datasets\'
----> 4 X, y = load_CIFAR10(cifar10_dir)
C:\path\data_utils.pyc in load_CIFAR10(ROOT)
18 f = os.path.join(ROOT, 'cifar10_train.p')
19 print('Path: ' + f );
---> 20 Xtr, Ytr = load_CIFAR_batch(f)
21 return Xtr, Ytr
22
C:\path\data_utils.pyc in load_CIFAR_batch(filename)
10 X = np.array(datadict['data'])
11 Y = np.array(datadict['labels'])
---> 12 X = X.reshape(-1, 3, 32, 32).transpose(0,2,3,1).astype("float")
13 return X, Y
14
MemoryError:
The error occurs at line 12. I know this is a memory-consuming assignment, but that doesn't mean 4 GB of RAM won't suffice, and that was confirmed when the code ran without problems outside Jupyter.
My guess is that it has something to do with a memory limit imposed either by Jupyter or by Chrome, but I'm not sure and I also don't know how to solve it.
By the way:
- I have a Windows 10 laptop with 4GB of RAM
- and Chrome Version 57.0.2987.133 (64-bit)
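For scale, the failing line (X.astype("float")) makes a float64 copy of the whole training set, which is much larger than the raw uint8 data. Here is a minimal sketch of the arithmetic, assuming the standard CIFAR-10 training set of 50,000 32x32 RGB images (the exact size in the assignment's pickle may differ):

    import numpy as np

    # Assumed: the standard CIFAR-10 training set of 50,000 RGB images, 32x32 pixels.
    n_images, values_per_image = 50000, 3 * 32 * 32

    raw_bytes = n_images * values_per_image                 # uint8: 1 byte per value
    float_bytes = raw_bytes * np.dtype("float64").itemsize  # astype("float") means float64

    print("raw uint8 array : %.2f GB" % (raw_bytes / 1024.0 ** 3))    # ~0.14 GB
    print("float64 copy    : %.2f GB" % (float_bytes / 1024.0 ** 3))  # ~1.14 GB

    # Both arrays exist at once while astype() runs, so the peak is ~1.3 GB.
    # A 32-bit Python process on Windows has only about 2 GB of usable address
    # space, so the copy can fail there even though the same code succeeds
    # under a 64-bit interpreter on a 4 GB machine.

If the assignment allows it, casting with astype(np.float32) instead would halve the size of the copy.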
Answered by Andriy Stolyar
Try running with Administrator privileges. Worked for me.
Answered by Lefty G Balogh
I am only a year and two months late to this question. The technical answer as to why is explained really nicely here: https://superuser.com/questions/372881/is-there-a-technical-reason-why-32-bit-windows-is-limited-to-4gb-of-ram
It also implies why the conda solution works.
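If you suspect the same cause, here is a quick check you can run in a notebook cell to see which interpreter build the kernel is actually using; a minimal sketch using only the standard library:

    import platform
    import sys

    # A 32-bit build reports '32bit' here and caps the whole process (your
    # NumPy arrays included) at a few GB of address space, no matter how much
    # physical RAM the machine has.
    print(platform.architecture()[0])  # '32bit' or '64bit'
    print(platform.python_version())

    # Equivalent check: sys.maxsize is 2**31 - 1 on 32-bit builds.
    print("64-bit build" if sys.maxsize > 2 ** 32 else "32-bit build")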
But for a lazy engineer's no-change workaround, close any Chrome tabs that are not absolutely necessary and restart your kernel so it starts afresh.
Kernel > Restart (& Run All)
Answered by Delsilon
A similar thing happened to me while loading an .npy file. Freeing up RAM solved the issue: there wasn't enough memory to load the file into variables. In my case both Firefox and Chrome were running, and closing Firefox solved the problem.
Useful command: free -h
A note of precaution before interpreting this command's output on your own: it is highly recommended to go through this page first: https://www.linuxatemyram.com/
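free is Linux-only; to run the same check from inside the notebook on any OS, here is a small sketch using the third-party psutil package (an assumption: psutil is not part of the assignment's code and has to be installed separately, e.g. with pip install psutil):

    import psutil

    # Snapshot of system memory. 'available' is the number that matters when
    # loading a large array, not 'free' (see linuxatemyram.com for why).
    mem = psutil.virtual_memory()
    print("total    : %.2f GB" % (mem.total / 1024.0 ** 3))
    print("available: %.2f GB" % (mem.available / 1024.0 ** 3))
    print("used     : %.1f%%" % mem.percent)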
Answered by Danfoa
Apparently this happens when the Python installation is not the best.
As a matter of fact, before solving the problem I had manually installed Python 2.7 on Windows along with the packages I needed. After messing around for almost two days trying to figure out what the problem was, I reinstalled everything with Conda and the problem was solved.
I guess Conda installs packages with better memory management, and that was the main reason.
Answered by Nabeel Mughal
You can reduce your dataset for training and testing; this can solve your memory error problem.
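A minimal sketch of what that could look like with the arrays from the traceback above; the names X and y follow the question's code, while the stand-in data and subset sizes are my own assumptions:

    import numpy as np

    # Stand-in data with the same shape/dtype the loader would produce, so the
    # sketch runs on its own. In the assignment you would instead do:
    #   X, y = load_CIFAR10(cifar10_dir)
    X = np.zeros((50000, 32, 32, 3), dtype="uint8")
    y = np.zeros(50000, dtype="int64")

    num_train, num_test = 5000, 500  # arbitrary small subsets

    # Slices are views, so this costs almost no extra memory by itself.
    X_train, y_train = X[:num_train], y[:num_train]
    X_test = X[num_train:num_train + num_test]
    y_test = y[num_train:num_train + num_test]

    # Cast only the subset you keep, and prefer float32 over float64
    # to halve the size of the copy.
    X_train = X_train.astype(np.float32)
    print(X_train.shape, X_train.dtype)  # (5000, 32, 32, 3) float32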