\section{Memory and Speed considerations}
\label{sec:pgfplots:optimization}
\PGFPlots\ can typeset plots with several thousand points if the memory limits of \TeX\ are configured properly. Its runtime is roughly proportional to the number of input points\footnote{In fact, the runtime is pseudo-linear: starting with about $100{,}000$ points, it will become quadratic. This limitation applies to the path length of \PGF\ paths as well. Furthermore, the linear runtime is not possible yet for stacked plots.}.

\message{ATTENTION: I am now about to typeset really huge pictures. You will need to ENLARGE YOUR TeX MEMORY CAPACITIES if this fails...}%
\begin{codeexample}[]
\begin{tikzpicture}
	\begin{axis}[
		enlargelimits=0.01,
		title style={yshift=5pt},
		title=Scatter plot with $2250$ points]
	\addplot[blue,
		mark=*,only marks,
		mark options={scale=0.3}]
		file[skip first]
		{plotdata/pgfplots_scatterdata3.dat};
	\end{axis}
\end{tikzpicture}
\end{codeexample}

\begin{codeexample}[]
\begin{tikzpicture}
	\begin{axis}[
		enlarge x limits=0.03,
		title=Ornstein-Uhlenbeck sample ($13000$ time steps),
		xlabel=$t$]
	\addplot[blue] file{plotdata/ou.dat};
	\end{axis}
\end{tikzpicture}
\end{codeexample}
\message{ok, passed.}%

\PGFPlots\ relies completely on \TeX\ to do all typesetting. It uses the front-end layer and the basic layer of \PGF\ to perform all drawing operations. For complicated plots, this may take some time, and you may want to read section~\ref{sec:pgfplots:importexport} to learn how to write single figures to external graphics files. Externalization is the best way to reduce typesetting time (a minimal sketch can be found at the end of this section). However, for large-scale plots with many points, the limits of \TeX's capacities are reached easily.

\subsection{Memory limitations}
The default settings of most \TeX\ distributions are quite restrictive, so it may be necessary to adjust them. For MiK\TeX, this can be done using simple command line switches:
\begin{codeexample}[code only]
pdflatex --stack-size=n --save-size=n --main-memory=n
	--extra-mem-top=n --extra-mem-bot=n --pool-size=n --max-strings=n
\end{codeexample}
\noindent Experiment with these settings if MiK\TeX\ runs out of memory. Usually, the log file contains a summary of the resources used, which gives a hint about which parameter needs to be increased.

For Unix installations, one needs to adjust config files. This can be done as follows:
\begin{enumerate}
	\item Locate |texmf.cnf| on your system, for example using |kpsewhich texmf.cnf|. On my Ubuntu installation, it is in |/usr/share/texmf/web2c/texmf.cnf|.
	\item Either change |texmf.cnf| directly, or copy it to some convenient place. If you copy it, here is how to proceed:
	\begin{itemize}
		\item Keep only the changed entries in your local copy to reduce conflicts. \TeX\ will always read \emph{all} config files found in its search path.
		\item Adjust the search path so that \TeX\ finds your local copy. This can be done using the environment variable |TEXMFCNF|. Assuming your local copy is in |~/texmf/mytexcnf/texmf.cnf|, you can write
\begin{codeexample}[code only]
export TEXMFCNF=~/texmf/mytexcnf:
\end{codeexample}
		to search first in your directory, then in all other system directories.
	\end{itemize}
	\item You should change the entries
\begin{codeexample}[code only]
main_memory = n
extra_mem_top = n
extra_mem_bot = n
max_strings = n
param_size = n
save_size = n
stack_size = n
\end{codeexample}
	The log file usually contains information about which parameter needs to be enlarged.
\end{enumerate}
Unfortunately, \TeX\ does not allow arbitrary memory limits; there is an upper bound hard-coded in the executables.
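
Whether an edited |texmf.cnf| is actually picked up can be checked with the kpathsea tool |kpsewhich|, which is part of Unix \TeX\ installations. The following sketch is not specific to \PGFPlots; it merely lists the configuration files in search order and expands two of the variables mentioned above to show the values which are currently in effect:
\begin{codeexample}[code only]
# list all texmf.cnf files which are read, in search order:
kpsewhich -a texmf.cnf

# show the values which are currently in effect:
kpsewhich -expand-var='$main_memory'
kpsewhich -expand-var='$pool_size'
\end{codeexample}
If the reported values do not change after editing the file, the local copy is probably not on the search path (see the remarks about |TEXMFCNF| above).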
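
As mentioned at the beginning of this section, externalization is the most effective way to reduce typesetting time once the memory limits are sufficient. The following is only a rough sketch of one possible approach, assuming the |external| library of \PGF\ is available; the export facilities of \PGFPlots\ are discussed in section~\ref{sec:pgfplots:importexport}.
\begin{codeexample}[code only]
% Sketch: write every tikzpicture to its own graphics file and reuse
% that file on subsequent runs. Requires `pdflatex -shell-escape' and
% an existing subdirectory `figures/'.
\documentclass{article}
\usepackage{pgfplots}
\usetikzlibrary{external}
\tikzexternalize[prefix=figures/]

\begin{document}
\begin{tikzpicture}
	\begin{axis}
	\addplot[blue] file{plotdata/ou.dat};
	\end{axis}
\end{tikzpicture}
\end{document}
\end{codeexample}
On subsequent runs, pictures whose graphics file already exists are simply included instead of being typeset again.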