\title{Further thoughts on virtual fonts \ldots}
\author[Yannis Haralambous]{Yannis Haralambous\\
\texttt{Yannis.Haralambous@univ-lille1.fr}}
\begin{Article}

In a paper published in 1993 (``Virtual fonts: great fun, not for
grand wizards only!'',~\cite{Ha1}) I have already
addressed many of Berthold Horn's arguments; nevertheless, I would
like to take this opportunity to respond a little further.

Horn's basic argument is that PostScript drivers can reencode
fonts, so that virtual fonts are unnecessary for plain reencoding.
This is certainly true, but unfortunately \emph{only within a very
limited scope}.

% Fonts originally created to be used with \TeX\ (both CM, DC
%and most of the other fonts) use the first~32 positions of the
%table to represent glyphs, while PostScript and TrueType need part
%of (or all of) these positions as control codes.  A reencoded
%PostScript (or TrueType) font is still a PostScript (or
%TrueType) font, and hence it \emph{cannot} use some (or all) of
%these~32 positions. Blue Sky Research has found a way out of this
%problem by re-encoding (internally) CM fonts so that the first~32
%positions are translated to the upper part of the 8-bit table.
%This works fine for~128-character fonts (CMR, CMMI, MSAM, Euler
%etc.) but \emph{not} for DC fonts, since these occupy
%\emph{already} all~256 positions.

%It follows that one will never be able to reencode a PostScript
%font so as to obtain a complete DC font. People will argue that
%many east-European characters are not available anyway in the
%fonts we might want to reencode. Nevertheless, virtual font
%techniques (such as those used by Alan Jeffrey in
%\texttt{fontinst.tex}, allow the creation of accented
%east-European characters, but composition of glyphs (glyphs for
%Eastern diacritics, like the h\'a\v cek or the ogonek, \emph{are}
%included in every font, one just has to use them to produce the
%characters, a thing plain reencoding cannot do)\footnote{Horn
% mentions utilities which can create \emph{new} PostScript fonts
% with new composed characters.  Alan Jeffrey uses \TeX\ to do
% this work, a much natural process than any utility. Furthemore,
% the copyright situation of such ``mutant'' PostScript fonts is
% unclear.}.
 To produce accented letters, many PostScript fonts contain
 ``composite character data'': these are simply translation
 coordinates for character parts, which are composed to produce
 the resulting glyph. PostScript interpreters know about such
 composite characters and handle them automatically. But these
 accented characters cover only the Western European range
 (excluding, of course, Welsh and Maltese); a PostScript
 interpreter is not clever enough to define a \emph{new}
 composite character, for example a \.z as needed in Polish, or
 a \^w as needed in Welsh. Alan Jeffrey's utility can do this
 very easily; this is far more than plain reencoding, but it is
 everyday practice for virtual fonts.
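
 As a minimal sketch of the idea, here is what such a composed
 character could look like in a VPL (virtual property list) file.
 The base font name, the metrics and the shift amounts are invented
 for illustration only; I assume an Adobe Standard encoded base
 font, in which the dot accent sits at octal position~307.
\begin{verbatim}
(MAPFONT D 0 (FONTNAME Times-Roman) (FONTAT R 1.0))
(CHARACTER O 276 (COMMENT zdotaccent, placed at some free slot)
   (CHARWD R 0.444) (CHARHT R 0.65) (COMMENT metrics invented)
   (MAP
      (PUSH)
      (MOVERIGHT R 0.08) (MOVEUP R 0.21)
      (SETCHAR O 307) (COMMENT the dotaccent glyph)
      (POP)
      (SETCHAR C z) (COMMENT the base letter)
      )
   )
\end{verbatim}
 A driver that understands virtual fonts replays this \texttt{MAP}
 program wherever the document uses that slot; the base font itself
 is never modified.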

 The reader may have little interest in exotic languages (like
 Polish or Welsh in the previous paragraph); but DC fonts have
 additional features, which Alan Jeffrey's virtual font creation
 tool implements in virtual Cork-encoded PostScript fonts:
   \begin{itemize}
   \item certain characters may need special kerning (such as the
 little zero for the perthousand sign, or the German single and
 double opening quotes);
   \item some symbols (such as \verb*= =, \S, \pounds) may be
 missing; these can be taken from other fonts;
   \item the glyph `-' is used twice: once for the text dash, and
 once for the hyphenation dash (cf.~\cite{Ha1} for why these are
 separate characters); I doubt that reencoding can assign the
 same glyph to two different positions (the VPL sketch after this
 list shows how easily a virtual font does it);
   \item the uppercase version of \ss\ is made out of two `S'
 letters; this is too much to ask of a poor PostScript
 interpreter\ldots
   \item PostScript fonts can contain ligatures, but not
 \emph{smart} ligatures: if you want your Dutch `\"e' to
 become an `e' at the end of a line, you need a
 begin-of-word ligature, something trivial for \TeX.
   \end{itemize}
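
 Two of these points can be sketched directly in VPL. The font
 names, slot numbers and metric values below are invented and follow
 no particular encoding; the point is only that a virtual font may
 map several slots to one glyph and borrow glyphs from a second base
 font, neither of which plain reencoding can express:
\begin{verbatim}
(MAPFONT D 0 (FONTNAME Times-Roman))
(MAPFONT D 1 (FONTNAME SomeDonorFont)) (COMMENT name invented)
(CHARACTER O 055 (COMMENT text dash)
   (CHARWD R 0.333)
   (MAP (SELECTFONT D 0) (SETCHAR O 055))
   )
(CHARACTER O 177 (COMMENT hyphenation dash: same glyph, second slot)
   (CHARWD R 0.333)
   (MAP (SELECTFONT D 0) (SETCHAR O 055))
   )
(CHARACTER O 247 (COMMENT section sign, missing from font 0)
   (CHARWD R 0.5)
   (MAP (SELECTFONT D 1) (SETCHAR O 247))
   )
\end{verbatim}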

 Virtual fonts are one of the most important aspects of
 the \TeX\ system. This is not just the case for exotic
 situations; I deliberately do not speak of Arabic and the other
 extremely important uses of virtual fonts in oriental
 languages; virtual fonts are important for all of us
 Occidental-language writers. A PostScript font has poor
 typographical properties (no smart ligatures, restricted
 character composition, since you have to remain in the same
 font and the same size, etc.); through virtual fonts,
 \TeX's typographical possibilities can be added to the font:
 a virtual font structure makes a PostScript font richer.

\begin{quote}\emph{\ldots It is not necessary to use virtual fonts to reencode a
font\ldots}\end{quote}

  True. But we want more than just reencoding: word processors like
  Word or WordPerfect simply reencode fonts; \TeX\ can get more out of
  a PostScript font, and the proof can be found in the virtual fonts
  made by Alan Jeffrey's utility.

% \begin{quote}\emph{\ldots Users of Y\&Y software use scalable outline fonts
% without VF}\end{quote}
% 
%  Rephrased: users of etc. etc. can use neither VF nor DC fonts (since
%   these are not yet scalable). Fortunately there are public domain
%   implementations of \TeX\ which give users these possibilities.
%   Anyway, I wouldn't like to make out of these notes the critique of
%   any software.
% 
Horn states that virtual fonts cannot make unencoded
characters accessible. This is certainly true, and---as he
says---this issue can only be solved by reencoding the font. But it
is not an argument against the use of virtual fonts: one can
always reencode a font into some universal encoding, for example
\texttt{ISOLatin1}. The latter may be universal, but it is still
not Cork. Some extra work must be done to make a Cork-encoded font
out of it, and this is best handled by a virtual font.
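
As an illustration (the metrics and the base font name are invented,
but the slot numbers are the Cork and ISO Latin-1 positions, assuming
I read both tables correctly): Cork keeps \ss\ at octal~377 while ISO
Latin-1 keeps it at octal~337, and Cork slot~337 holds the uppercase
`SS', which exists in neither encoding of the base font. A virtual
font bridges all of this in a few lines of VPL:
\begin{verbatim}
(MAPFONT D 0 (FONTNAME Times-ISOLatin1)) (COMMENT reencoded base)
(CHARACTER O 377 (COMMENT Cork slot of germandbls)
   (CHARWD R 0.5) (COMMENT metrics invented)
   (MAP (SETCHAR O 337)) (COMMENT Latin-1 slot of germandbls)
   )
(CHARACTER O 337 (COMMENT Cork slot of uppercase SS)
   (CHARWD R 1.112)
   (MAP (SETCHAR C S) (SETCHAR C S)) (COMMENT two S letters)
   )
\end{verbatim}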

I agree that reencoding is the only way to make characters such as
Thorn or Eth appear; but it should be only one step of the
printing process, among others.

\begin{quote}\emph{\ldots Making a fake smallcaps font\ldots
A smallcaps font should have properly designed small caps
letters\ldots}\end{quote}

  Horn is \emph{absolutely} right when he says that one should rather
  adopt an `Expert' font than fake small caps by scaling regular
  caps. Now, suppose you buy that Expert font. What is the next step?
  You will discover that Expert fonts do not contain uppercase letters
  (cf.~\cite{Post}, page~602). Is there a possibility of merging the
  regular and expert fonts into what we expect a small-caps font to be,
  using plain reencoding? I'm afraid not, since reencoding means
  ``assigning glyphs to positions \emph{inside} a font'' and not
  ``\emph{between} fonts''; you will have to use virtual fonts. Alan
  Jeffrey's utility automatically finds out whether there is an expert
  font and which characters it contains. It then either creates fake
  small caps, or takes (just) the real small caps from the expert font.
  Furthermore, there cannot possibly be any kerning pairs between small
  caps and uppercase letters in the PostScript fonts, since these are
  not contained in the same 256-character table. But the \TeX\ 
  virtual font can contain such kerning pairs (some of them, like
  \textsc{Ta} or \textsc{Va}, being quite important); after a little
  experimenting, the quality-conscious user will easily add the most
  important kerning pairs to the VPL file.
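
  A minimal sketch of what such a merged small-caps virtual font
  might contain. The font names, slot assignments and kern value
  are all invented; in particular, I assume the expert font has been
  reencoded so that its small caps sit at the lowercase positions:
\begin{verbatim}
(MAPFONT D 0 (FONTNAME Times-Roman)) (COMMENT uppercase letters)
(MAPFONT D 1 (FONTNAME Times-Expert)) (COMMENT real small caps)
(LIGTABLE
   (LABEL C T)
   (KRN C a R -0.08) (COMMENT kern T against small-cap a; invented)
   (STOP)
   )
(CHARACTER C T
   (CHARWD R 0.61)
   (MAP (SELECTFONT D 0) (SETCHAR C T))
   )
(CHARACTER C a (COMMENT lowercase slot shows a real small cap)
   (CHARWD R 0.44)
   (MAP (SELECTFONT D 1) (SETCHAR C a))
   )
\end{verbatim}
  The kern crosses the boundary between the two base fonts, something
  no amount of reencoding could express.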

\begin{quote}\emph{\ldots Only \TeX\ knows anything about virtual fonts\ldots}
\end{quote}

Actually Horn is not saying ``do not use virtual fonts'', but ``do not
use PK fonts'', since ``they will never be able to enter into
illustrations''. He continues:

\begin{quote}\emph{\ldots Well, in the \TeX\ world we tend to be somewhat myopic.
    \ldots} \ldots \emph{\ldots Hence PK bitmapped fonts are not
    useful, one needs to use fonts in some established industry
    standard form such as Type~1 or TrueType\ldots}\end{quote}

  To my (myopic?) eyes, the PostScript world seems far more myopic.
  For many years, poor PostScript fonts have been designed; in the
  meantime the \TeX\ community kept saying ``fonts without metaness
  are anti-typographic'', but (apart from a few exceptions, like Jacques
  Andr\'e's papers on point-size-dependent PostScript font code,
  cf.~\cite{andre} and~\cite{andre-vatton}) metaness seemed to be
  taboo outside the \TeX\ world; then, two years ago, the goddess
  Adobe suddenly declared that fonts without metaness are no good, and
  introduced a new object of veneration: Multiple Master fonts.
  These are extremely complex and memory-consuming, but still much
  poorer than \MF-created fonts; nevertheless, (myopic) PostScript
  font users consider them the \emph{non plus ultra}.

  \MF\ can do things PostScript cannot even dream of. Try to adjust
  the grey density of Hindi, Arabic and Latin text on the same page
  with PostScript fonts. Horn says that \emph{scaled small caps are
    fake small caps}. I say: scaled fonts are always fake: \emph{all
    PostScript fonts are faked when used at a size different from
    their design size} (and most of the time we don't even know what
  that design size is; these are things the customer had better not
  find out\ldots).

  Erik-Jan Vens has developed a tool to convert PostScript fonts to
  \MF. This opens new horizons for digital typography, since we can
  manipulate these fonts using \MF\ tools. DVI drivers which do not
  read PK files will never be able to take advantage of these methods
  (cf.~\cite{HarDar}).

\begin{quote}\emph{\ldots (note that virtually 
    all fonts commonly used with \TeX\ are now available in Type~1
    format, including CM, AMS etc.)\ldots}\end{quote}

  But I would add the word `obsolete' after `CM': IMHO, CM fonts are
  \emph{just good enough to write English}. It is quite an irony that
  the text you are reading at this very moment is written in
  English\footnote{Though I could have switched languages at any
    moment; if I write in English, it is not for the glory of that
    language but to make reading easier for the British reader.
    Let us move on\ldots}, but here in Europe hundreds of millions of
  people communicate in other languages, which cannot be hyphenated
  with CM fonts (cf.~\cite{HaTTN}). Of course, nobody will ever force
  the only-English-writing \TeX\ user to use DC fonts, but can
  progress be stopped?

  Finally, Horn omits a very important issue: there is a tool called
  DVICopy (written by Peter Breitenlohner). Using DVICopy one can
  \emph{de-virtualize} a document, that is, \emph{replace characters
    from a virtual font by the real character(s) they represent}.
  This eliminates all communication problems: suppose I have created a
  document using a PostScript font, which itself is encoded in some
  standard encoding. For this I have used a virtual font, which my
  correspondent might not necessarily have.
%  \begin{tiny}(in fact, if he/she uses certain software, he/she won't
%even able to use virtual fonts anyway, but that's another
%issue)\end{tiny}. 
By de-virtualizing my DVI file, I obtain a new
DVI file which uses precisely and exclusively the real font on
which my virtual font was based. In the case of PostScript fonts,
this means that if my virtual fonts were constructed upon Adobe
Standard encoded PostScript fonts (the usual encoding for
PostScript fonts), a de-virtualized DVI file will contain
references to these original PostScript fonts only, which makes it
as portable as a DVI file can be.
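
In practice (assuming the Web2C incarnation of the program, where
input and output files are given as arguments, and with hypothetical
file names), the whole operation is a single command:
\begin{verbatim}
# de-virtualize paper.dvi: every virtual-font reference is
# expanded into references to the underlying real fonts, so
# paper-devirt.dvi needs no VF files at all
dvicopy paper.dvi paper-devirt.dvi
\end{verbatim}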

I would like to close this paper with some general remarks on the
``\TeX\ world'', as I see it: I don't believe \TeX\ users are myopic
or isolated from the rest of the world. On the contrary, they see
problems that commercial programs can barely handle, and solve them
through \TeX\ without even making much noise about it. In the last few
years there has been much more development in public domain \TeX ware
than in commercial software around \TeX.
Important innovations have always appeared first in public domain
software\footnote{With a single exception: the user interface. Public
  domain software is never as user-friendly as commercial software.}.
Many times commercial software has adopted those innovations; but
there are also many \TeX\ features yet undiscovered by the commercial
world---and many commercial products still in the Stone Age of \TeX.
This is a sad consequence of the fact that \TeX\ is a public domain
program whose ``official'' development has stopped and has been
unofficially taken over by mostly unorganized volunteers: this makes
both the charm and the pain of \TeX\ history. Virtual fonts may be one
of the innovations that commercial products haven't all adopted
yet---or maybe not; but we should think twice before giving up
virtual fonts in exchange for something poorer (PostScript font
reencoding), when we can equally well use both at the same time and
produce even better results.

\begin{thebibliography}{666}
\bibitem{Post} Adobe Systems Incorporated, \emph{PostScript Language
Reference Manual}, second edition, Addison-Wesley, 1990.
\bibitem{andre} Jacques Andr\'e, `Adapting Character Shape to Point
Size', \emph{PostScript Review}, April 1991.
\bibitem{andre-vatton} Jacques Andr\'e and Ir\`ene Vatton, `Contextual
Typesetting of Mathematical Symbols---Taking Care of Optical Scaling',
submitted to \emph{Electronic Publishing}, 1993.
\bibitem{HaTTN} Yannis Haralambous, `\TeX\ conventions concerning
languages', \emph{\TeX\ and TUG News}, Volume 1, Number 4, 1992.
\bibitem{Ha1} Yannis Haralambous, `Virtual Fonts: Great Fun, not for
Wizards Only', \emph{Minutes and APpendiceS} 93.1, Nederlandstalige
\TeX\ Gebruikersgroep, 1993.
\bibitem{HarDar} Yannis Haralambous, `Parameterization of PostScript
fonts through \MF\ --- an alternative to Adobe's multiple-master
fonts', to appear in \emph{Proceedings of Raster Imaging and Digital
Typography}, Darmstadt, 1994.
\end{thebibliography}
\end{Article}
\endinput