\section{Memory and Speed Considerations}
\subsection{Memory Limits of \TeX}
\label{sec:pgfplots:optimization}
\PGFPlots\ can typeset plots with several thousand points if the memory limits of \TeX\ are configured properly. Its runtime is roughly proportional to the number of input points\footnote{In fact, the runtime is pseudo-linear: starting at about $100{,}000$ points, it becomes quadratic. This limitation applies to the path length of \PGF\ paths as well. Furthermore, linear runtime is not yet available for stacked plots.}.

\pgfplotsexpensiveexample
\begin{codeexample}[]
\begin{tikzpicture}
\begin{axis}[
	enlargelimits=0.01,
	title style={yshift=5pt},
	title=Scatter plot with $2250$ points]
	
\addplot[blue,
	mark=*,only marks,mark options={scale=0.3}]
	file[skip first]
	{plotdata/pgfplots_scatterdata3.dat};
	
\end{axis}
\end{tikzpicture}
\end{codeexample}

\pgfplotsexpensiveexample
\begin{codeexample}[]
\begin{tikzpicture}
\begin{axis}[
	enlarge x limits=0.03,
	title=Ornstein-Uhlenbeck sample
		($13000$ time steps),
	xlabel=$t$]
	
\addplot[blue] file {plotdata/ou.dat};
\end{axis}
\end{tikzpicture}
\end{codeexample}

\PGFPlots\ relies completely on \TeX\ to do all typesetting. It uses the front-end layer and the basic layer of \PGF\ to perform all drawing operations. For complicated plots, this may take some time, and you may want to read section~\ref{sec:pgfplots:importexport} to learn how to write single figures to external graphics files. Externalization is the best way to reduce typesetting time.
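
As a rough preview of what is described there, externalization can be activated in the preamble as sketched below. The |prefix| value |tikz/| is just an example choice (such a sub-directory typically needs to exist before the first run):
\begin{codeexample}[code only]
% preamble sketch -- see the referenced section for details:
\usepgfplotslibrary{external}
% write each picture to a separate graphics file in tikz/ :
\tikzexternalize[prefix=tikz/]
\end{codeexample}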

However, for large-scale plots with many points, the capacity limits of \TeX\ are easily reached.

\subsection{Memory Limitations}
The default settings of most \TeX\ distributions are quite restrictive, so it may be necessary to adjust them.

Usually, the log file or the final error message contains a summary of the resources used, giving a hint as to which parameter needs to be increased.
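
If a limit is exceeded, the run typically aborts with an error message of the following form; the parameter named in brackets (in this made-up example the save stack, which is configured via |save_size|) is the one which needs to be enlarged:
\begin{codeexample}[code only]
! TeX capacity exceeded, sorry [save size=80000].
\end{codeexample}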

\subsubsection{Mik\TeX}
For Mik\TeX, memory limits can be increased in two ways. The first is to use command line switches:
\begin{codeexample}[code only]
pdflatex 
	--stack-size=n --save-size=n 
	--main-memory=n --extra-mem-top=n --extra-mem-bot=n
	--pool-size=n --max-strings=n 
\end{codeexample}
\noindent Experiment with these settings if Mik\TeX\ runs out of memory. Usually, one does not invoke |pdflatex| manually; instead, a development environment issues the invocations, so the switches need to be configured there. 
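
For example, a single run with enlarged limits could be started as sketched below; the file name |bigplots.tex| and the concrete values are only examples and may need further adjustment:
\begin{codeexample}[code only]
rem example values only; adjust the file name and the limits as needed:
pdflatex --save-size=80000 --main-memory=90000000 bigplots.tex
\end{codeexample}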

Sometimes it may be better to adjust the Mik\TeX\ configuration permanently, for example to avoid reconfiguring the \TeX\ development environment. This can be done with the command
\begin{codeexample}[code only]
initexmf --edit-config-file=pdflatex
\end{codeexample}
\noindent which can be entered either at a Windows command prompt or via Start $\gg$ Run. As a result, an editor opens with the correct config file. A sample config file could be
\begin{codeexample}[code only]
main_memory=90000000
save_size=80000
\end{codeexample}
or any of the config file entries listed below can be entered. 
Thanks to ``LeSpocky'' for his documentation at

\url{http://blog.antiblau.de/2009/04/21/speicherlimits-von-miktex-erhoehen}.

\subsubsection{\TeX Live or similar installations}
For Unix installations, one needs to adjust config files. This can be done as follows:
\begin{enumerate}
	\item Locate |texmf.cnf| on your system. On my Ubuntu installation, it is in 
	
	|/usr/share/texmf/web2c/texmf.cnf|.
	\item Either change |texmf.cnf| directly, or copy it to some convenient place. If you copy it, here is how to proceed:
		\begin{itemize}
			\item Keep only the changed entries in your local copy to reduce conflicts: \TeX\ will always read \emph{all} config files found in its search path.
			\item Adjust the search path to find your local copy. This can be done using the environment variable |TEXMFCNF|. Assuming your local copy is in |~/texmf/mytexcnf/texmf.cnf|, you can write
\begin{codeexample}[code only]
export TEXMFCNF=~/texmf/mytexcnf:
\end{codeexample}
			to search first in your directory, then in all other system directories.
		\end{itemize}
	\item You should change the entries
\begin{codeexample}[code only]
main_memory = n
extra_mem_top = n
extra_mem_bot = n
max_strings = n
param_size = n
save_size = n
stack_size = n
\end{codeexample}
		The log file usually contains information about which parameter needs to be enlarged.
\end{enumerate}
A complete example of such a local config file is shown below; it increases the memory limits.
\begin{enumerate}
	\item Create the file |~/texmf/mytexcnf/texmf.cnf| (and possibly the paths as well).
\begin{codeexample}[code only]
% newly created file ~/texmf/mytexcnf/texmf.cnf:
% If you want to change some of these sizes only for a certain TeX
% variant, the usual dot notation works, e.g.,
% main_memory.hugetex = 20000000
main_memory = 230000000 % words of inimemory available; also applies to inimf&mp
extra_mem_top = 10000000     % extra high memory for chars, tokens, etc.
extra_mem_bot = 10000000     % extra low memory for boxes, glue, breakpoints, etc.
save_size = 150000	% for saving values outside current group
stack_size = 150000	% simultaneous input sources

% Max number of characters in all strings, including all error messages,
% help texts, font names, control sequences.  These values apply to TeX and MP.
%pool_size = 1250000
% Minimum pool space after TeX/MP's own strings; must be at least
% 25000 less than pool_size, but doesn't need to be nearly that large.
%string_vacancies = 90000
% Maximum number of strings.
%max_strings = 100000
% min pool space left after loading .fmt
%pool_free = 47500
\end{codeexample}
	\item Run |texhash| such that \TeX\ updates its |~/texmf/ls-R| database.
	\item Create the environment variable |TEXMFCNF| and assign the value `|~/texmf/mytexcnf:|' (including the trailing `|:|'!). On my Linux system, this can be done by adding
\begin{codeexample}[code only]
export TEXMFCNF=~/texmf/mytexcnf:
\end{codeexample}
	to |~/.bashrc|.
\end{enumerate}
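
To check whether the local copy is actually found first, one can query |kpsewhich| as sketched below (the reported path and value will, of course, depend on your installation):
\begin{codeexample}[code only]
kpsewhich texmf.cnf               # should report ~/texmf/mytexcnf/texmf.cnf
kpsewhich -var-value=main_memory  # should report the new value, e.g. 230000000
\end{codeexample}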

Unfortunately, \TeX\ does not allow arbitrary memory limits; there is an upper bound hard-coded in the executables.

\subsection{Reducing Typesetting Time}
\PGFPlots\ performs a lot of work, ranging from abstract coordinate computations to low-level |.pdf| drawing commands (realized by \PGF). For complex plots, this may take considerable time, especially for 3D plots.

One way to reduce typesetting time is to tell \PGF\ to generate separate, temporary |.pdf| (or |.eps|) documents for some (or all) of the graphics in one run and to reuse these temporary images in successive runs. For \PGFPlots, this is the most effective way to reduce typesetting time. It can be accomplished using the |external| library described in section~\ref{sec:pgfplots:export}.
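
A rough sketch of the required invocation is shown below; it assumes that the |external| library has been loaded and activated in the preamble (compare the preamble sketch earlier in this section), and the file name |bigplots.tex| is only an example:
\begin{codeexample}[code only]
# TeX Live / Unix; MiKTeX uses --enable-write18 instead of -shell-escape:
pdflatex -shell-escape bigplots.tex
\end{codeexample}
\noindent In later runs, graphics which are already up to date are simply included instead of being redrawn, which is where the time is saved.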