
LaTeX troubles

Started by February 18, 2010 03:31 AM
5 comments, last by MDI 14 years, 8 months ago
Hi,

LaTeX allows making a document with style and content separated and so on, which is a beautiful idea and concept. The LaTeX syntax is also fine with me. No problems there.

But there are some things about LaTeX that have always disturbed me. Why do people keep using this old monster, and why are there no modern alternatives that offer the same concept but are much better executed than LaTeX? Here's what I dislike about LaTeX...

1) You need to run the compiler twice before the document is really ready. This is because the LaTeX process apparently has two build steps (probably the second is where it fills in all the reference numbers for the table of contents and such). I simply don't understand why I, the user of LaTeX, need to know and care about that. When compiling C++, there are MORE than two steps: preprocessing, compiling, linking... But do I need to run the C++ compiler three times? No, just once, and it does everything. Why can't LaTeX do that too? If it's for performance: I know TeX is from the 70's, but we're not using computers from the 70's today... And if the command-line way of running LaTeX compilers requires a few steps, OK, fine, but even graphical TeX editors like Kile actually require you to press the compile button twice before your document looks right!

2) You need to have lots, and lots, and lots of modules installed to be able to compile documents. And these modules are scattered, all have their own installation procedures, etc. Basically, you can't just install texlive and be ready to compile relevant .tex files. Many things won't work unless you install all the correct modules. I know that with C++ there is a similar idea, namely libraries and header files on which projects depend, but somehow that seems clearer and more consistent in C++. Also, on Windows a while ago I remember having two LaTeX compilers installed, where one gave errors about .jpg files and the other gave errors about .png files, so I had to choose which image type I'd use in my documents. Come on, images aren't even in the core standard of .tex??

3) (At least on Linux) you actually need to modify your own Linux distro to be able to use LaTeX! It needs various files in various places of your system, fonts installed in the correct directory, PATH settings in your .profile, and so on. The base install of texlive with the package manager already does such things, but that is NOT enough to be able to compile most actual existing .tex documents out there! The fact that Ubuntu offers texlive 2007 and not 2009, while recent .tex files need the latter, obviously doesn't help.

So basically, what I'm missing in LaTeX is a single good standard that defines how to convert .tex code to a graphical document and how modules/plugins/libraries/... work, and then various compilers that adhere to this standard and can independently do the work of converting valid .tex code to such a document in, to the user, a single straightforward step.

Has anyone else had good or bad experiences with LaTeX? Have you ever had trouble compiling a .tex document and needed to perform a long list of manual installation steps before it worked? Or does everything related to LaTeX somehow work smoothly for you?
I personally used LaTeX up to April 2008, when I finally switched over to Mac. Our company product still uses LaTeX for typesetting, though (it has nothing to do with games, just in case you wonder).

The Linux running LaTeX was always SuSE, and the integration was mostly fine. The various source directories are examined and their files get cached when texconfig is invoked the right way. After that, even our own style files and fonts were smoothly integrated. So I've only ever had a few minor troubles.
Quote: Original post by Lode
Hi,

LaTeX allows making a document with style and content separated and so on, which is a beautiful idea and concept. The LaTeX syntax is also fine with me. No problems there.

But there are some things about LaTeX that have always disturbed me. Why do people keep using this old monster, and why are there no modern alternatives that offer the same concept but are much better executed than LaTeX?

I use it because it works wonderfully and produces consistent results for quite small effort. It is indeed old, and it shows. But I have not seen anything else with the same quality-to-cost ratio... for me, personally.

You also have to keep in mind that TeX by itself will not change. It is set, and any major bugs are most likely features by now. TeX is pretty much defined by what it is, not what it is supposed to be.

Quote: Original post by Lode
Here's what I dislike about LaTeX...

1) You need to run the compiler twice before the document is really ready. This is because the LaTeX process apparently has two build steps (probably the second is where it fills in all the reference numbers for the table of contents and such). I simply don't understand why I, the user of LaTeX, need to know and care about that. When compiling C++, there are MORE than two steps: preprocessing, compiling, linking... But do I need to run the C++ compiler three times? No, just once, and it does everything. Why can't LaTeX do that too? If it's for performance: I know TeX is from the 70's, but we're not using computers from the 70's today... And if the command-line way of running LaTeX compilers requires a few steps, OK, fine, but even graphical TeX editors like Kile actually require you to press the compile button twice before your document looks right!

This is how LaTeX works: it compiles in single passes only. Two passes are usually needed to get document-local references correct; the first pass writes them to the auxiliary file, and the second pass reads them back from it. You even need a third pass when you use an external bibliography program like BibTeX, and that program has to be run between the LaTeX passes, so having TeX do all the passes automatically by itself is not feasible. Separate bibliographies (the bibunits package, for example) even require an extra run of the external program per bibliography. So many passes are required, some of which the TeX compiler is not even aware of.

I use WinEdt, and it performs all passes as needed (a second pass if the references changed, a third pass if the bibliography changed, for example), at least for the basic set of references. TeXnicCenter also does that, if I remember correctly. But do you really need to perform all passes every time? During document development, does it really matter if some references aren't set properly everywhere? Even if you do it manually, a makefile or a simple shell script can do it all for you.
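For example, a trivial build script (just a sketch; "report" is a made-up file name, substitute your own) would be:

    #!/bin/sh
    # first pass: writes labels and \cite keys to the .aux file
    latex report.tex
    # resolve citations from the .aux file (only needed if you use BibTeX)
    bibtex report
    # two more passes so the bibliography and cross-references settle
    latex report.tex
    latex report.tex

Run that one script and, from your point of view, the document builds in a single step.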



Quote: Original post by Lode
2) You need to have lots, and lots, and lots of modules installed to be able to compile documents. And these modules are scattered, all have their own installation procedures, etc. Basically, you can't just install texlive and be ready to compile relevant .tex files. Many things won't work unless you install all the correct modules. I know that with C++ there is a similar idea, namely libraries and header files on which projects depend, but somehow that seems clearer and more consistent in C++. Also, on Windows a while ago I remember having two LaTeX compilers installed, where one gave errors about .jpg files and the other gave errors about .png files, so I had to choose which image type I'd use in my documents. Come on, images aren't even in the core standard of .tex??

The packages I've needed can all be found in the package managers (both MikTeX's and TeX Live's), so they are just a click away. MikTeX even has an automatic package manager which installs missing packages when you compile a document that requires one. Some special packages are needed at times (for example, IEEE's publication styles or specific conference styles), but they are just a single file you can put in the document directory and you're set.
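To give an idea of how little work that is (a sketch; the exact file and class name depend on the publisher), you drop the publisher's file, say IEEEtran.cls, next to your .tex file and start the document with

    \documentclass[conference]{IEEEtran}

TeX searches the current directory before the system trees, so nothing has to be installed system-wide.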

No, as far as I know, there is no bitmapped graphics support in plain TeX. I believe it's up to the individual drivers (the pdf or dvi backend, for example) to support importing external graphics of any kind. I use vector graphics as much as I can (tip: PGF/TikZ is just wonderful), but when needed, I haven't had any problems with either JPG or PNG in any distribution I've used.
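For the record, the usual way to pull in a bitmap with pdflatex (a minimal sketch; the file name is made up) is the graphicx package:

    \documentclass{article}
    \usepackage{graphicx}  % graphics inclusion, ships with TeX Live and MikTeX
    \begin{document}
    \includegraphics[width=0.8\textwidth]{screenshot.png}  % PNG and JPG work directly under pdflatex
    \end{document}

If you compile to DVI with plain latex instead, you are essentially limited to EPS figures, which is probably where the JPG-versus-PNG errors you saw came from.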

Quote: Original post by Lode
3) (At least on Linux) you actually need to modify your own Linux distro to be able to use LaTeX! It needs various files in various places of your system, fonts installed in the correct directory, PATH settings in your .profile, and so on. The base install of texlive with the package manager already does such things, but that is NOT enough to be able to compile most actual existing .tex documents out there! The fact that Ubuntu offers texlive 2007 and not 2009, while recent .tex files need the latter, obviously doesn't help.

Quite the opposite of my experience. Both MikTeX and TeX Live have provided what I've needed. I've only worked with Windows though, so I cannot comment on other systems. However, I have a hard time seeing why, say, TeX Live's Windows and Linux distributions would differ in any significant way.

Quote: Original post by Lode
So basically, what I'm missing in LaTeX is a single good standard that defines how to convert .tex code to a graphical document and how modules/plugins/libraries/... work, and then various compilers that adhere to this standard and can independently do the work of converting valid .tex code to such a document in, to the user, a single straightforward step.

Has anyone else had good or bad experiences with LaTeX? Have you ever had trouble compiling a .tex document and needed to perform a long list of manual installation steps before it worked? Or does everything related to LaTeX somehow work smoothly for you?

I cannot say I have experienced any of your issues in any significant way. Yes, I have manually installed some libraries (publication styles, as mentioned above), but a single file in the document directory is minimal work. Yes, it's old and the workflow is inconvenient at times, but the quality is just too good to pass up.

I'm not trying to say you're wrong, because your experience is your experience. Your question was basically whether anyone else had experienced this, so that's all I commented on: no, I haven't... in any significant way.
"It is set, and any major bugs are most likely features by now."

It doesn't have bugs -- Knuth has money riding on that. He actually pays people who find a bug, and last did so some years ago.

"1) You need to run the compiler twice before the document is really ready."

Yes. Engines like Word get around this problem by, basically, not always being right. Word's kerning is also absolute rot.

"2) You need to have lots, and lots, and lots, of modules installed to be able to compile documents. "

You could just install Memoir. It's the nuclear weapon of LaTeX page design. Once you've installed that, you don't need anything else.
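A memoir document looks like any other LaTeX file; you only swap the class (a minimal sketch):

    \documentclass[a4paper,11pt]{memoir}
    \begin{document}
    \chapter{Introduction}
    Memoir bundles the functionality of a whole pile of separate layout packages.
    \end{document}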

"3) (at least on Linux) you actually need to modify your own linux distro to be able to use LaTeX!"

OK, the problem here is that LaTeX isn't a complete system. LaTeX is like Linux: very few people use a raw kernel; almost everyone uses a cooked version -- Ubuntu, Debian, SuSE, etc. Likewise, LaTeX on its own is rather raw. What you want is a distribution, such as teTeX or the aforementioned packaging systems.

Have you tried out LyX? It's a WYSIWYG editor which uses LaTeX as the layout system, so you get all the advantages of having a proper typesetting engine in the background without the aggro of having to learn all the \{}[]{}[{}] isms. It handles all the rebuilding for you and comes with loads of useful LaTeX stuff built in already.

You need ConTeXt!

I resisted this for some time when deciding on a TeX typesetting package (mostly because it's not as big a name), but after I warmed up to it, I realized ConTeXt has an advantage over LaTeX in virtually every way:

- Actively maintained by a commercial company

- Monolithic design means you get all your goodies in one place. No more hunting down modules.

- Fewer workarounds and module conflicts. It just works like it's supposed to, I swear!

- It still takes several passes to put together a document, but it does them for you in one operation. Bad news: it's a bit slower than LaTeX, if speed is a concern.

- Awesome documentation. Too much of it, in fact. Bad news: it's all in PDF (of course).

- It does more than LaTeX does (from what I can tell). Just read the docs themselves for examples of what it can do.

- Isn't "Mathematics Thesis Paper" -oriented. You know what i mean. It's geared towards the broader concept of laying out a page, regardless of content or purpose.


I dare you to try it. Once you get over the fact that it's not LaTeX and uses different commands, it's pretty cool.
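If you want a taste, a hello-world in ConTeXt is just

    \starttext
    Hello, ConTeXt!
    \stoptext

and you build it with a single command, texexec file.tex (or context file.tex with the newer MkIV engine), which handles all the reruns for you. The exact command may differ depending on which version your distribution ships, so check the ConTeXt wiki if it complains.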
Quote: Original post by Lode
1) You need to run the compiler twice before the document is really ready.


Try rubber; it was created to address this issue.
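Typical usage is one command per document, something like

    rubber --pdf mydocument.tex

(mydocument.tex being whatever your main file is), and it reruns latex and bibtex as often as needed until everything is resolved.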

Quote:
2) You need to have lots, and lots, and lots of modules installed to be able to compile documents.

3) (At least on Linux) you actually need to modify your own Linux distro to be able to use LaTeX!


I'm curious what you do that doesn't work with Ubuntu's packages. I installed texlive-full 2007 on Kubuntu Karmic and haven't run into anything that was too modern or too exotic to compile. The package manager didn't require me to do anything more than tell it to install the package, and it brought in the packages (like texlive-publishers) which included things (like revtex) I thought I'd have to get myself.
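In other words, the whole installation amounted to something along the lines of

    sudo apt-get install texlive-full

with no PATH tweaking or manual font installation afterwards.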

If you really need texlive 2009, there's a PPA that looks promising.

Quote: Original post by Katie
Have you tried out LyX? It's a WYSIWYG editor which uses LaTeX as the layout system, so you get all the advantages of having a proper typesetting engine in the background without the aggro of having to learn all the \{}[]{}[{}] isms. It handles all the rebuilding for you and comes with loads of useful LaTeX stuff built in already.


I'll second LyX. With Beamer, it's pretty much replaced my use of Word/Writer and PowerPoint/Impress. It's not perfect, though: it lets you choose combinations of options that won't compile (in particular with the bibliography), and I had to insert some raw LaTeX to match the precise formatting rules for my thesis. The LaTeX it produces is fairly readable. I've produced documents in it and then exported them to LaTeX for people who don't use LyX.
I use LaTeX every day on Linux, and much of what you say is incorrect. If you've installed texlive using a decent package manager, like apt-get, you don't need to mess around with environment variables. Similarly, unless you're doing some really obscure stuff, texlive has literally everything you need. Chances are, though, that you haven't got all of texlive installed. Have you got texlive-latex-extra?

Quote: Original post by Lode
LaTeX allows making a document with style and content separated and so on, which is a beautiful idea and concept. The LaTeX syntax is also fine with me. No problems there.

But there are some things about LaTeX that have always disturbed me. Why do people keep using this old monster, and why are there no modern alternatives that offer the same concept but are much better executed than LaTeX?


A bit of a coincidence: I've been thinking about trying to write a replacement for LaTeX for a while, as there are some things genuinely wrong with the system. However, the more you think about it, the more gargantuan the task appears. That's the reason.

This topic is closed to new replies.
