I used LaTeX for approximately 10 years, for everything from little things to relatively complex projects, including my bachelor’s and master’s theses. It never felt natural, reliable, or consistent. Every customization required weird \makeatletter \makeatother hacks and was very brittle. Everything seemed more complicated than necessary and hard to grok, with weird interdependencies and interactions.
There are probably good reasons for all of that, but it is just both bad DX and bad UX. It feels like you need to be a hardcore LaTeX expert, or consult with one, in order to accomplish the most mundane things, especially in a reliable way that won’t break when you make seemingly unrelated changes, or break other things itself.
I used Typst for a few weeks. It already feels much more understandable, consistent, hackable, and customizable. I guess that is the difference between an ad hoc macro system and an actually thought-through programming language.
The only drawback I can see is the ecosystem being smaller and less mature. That is, however, counteracted by being able to do things on your own, without immersing yourself deeply in LaTeX for years. Also, it will improve with time.
LaTeX is great, don’t get me wrong. But its heritage and historical baggage is really dragging it down.
It's kinda fascinating how dominant LaTeX is, how nice its output is, how respected Knuth is as a computer scientist, and at the same time how totally awful it feels to use it. Hard to figure out how it can be so good and so bad at once.
Posts/discussion I found interesting:
- http://www.goodmath.org/blog/2008/01/10/the-genius-of-donald...
- https://tex.stackexchange.com/q/24671
- https://news.ycombinator.com/item?id=15733381
In particular it's interesting how people seem to think TeX itself is actually quite nice to use but its popularity and LaTeX packages created a huge mess of a system.
Well -- TeX is "80s good". We've gotten better at designing ergonomic software since and it really doesn't meet the modern standard. But it's good enough for most people, and sufficiently hard to replace, that it has stuck around.
Added to that, academics specifically are more willing to suffer old crufty stuff than software engineers tend to be. After all their job is to absorb fields of material whether good or bad, and the technology tends to be lagging behind the bleeding edge in many subfields anyway so TeX doesn't even necessarily stand out.
part of the challenge is the inherent irreducible complexity of the domain. "Make text look good on page" leaves lots of details unspecified.
another part is many people built their own solution to their own corner of this domain, and not all of them had the deep appreciation for how the rest of the TeX system works.
I hear similar complaints about "Make web page look good", which is popular but also a huge mess of a system.
I’ve been using Typst for years now. Wrote my PhD thesis in it [1] as well as a book. Works great; can’t recommend it enough. I usually barely use plugins, because it’s either already included or pretty easy to write a bit of code yourself.
[1]: https://github.com/rikhuijzer/phd-thesis
Didn't see a PDF of your thesis, except on your website [1]. But the version there (at least as it renders on my machine) has numerous formatting issues. For one egregious example, look at the letter spacing in the title and legend of Figure 2.2 (page 27): "civilia ns", "Pe rs ona lity s core". I'm sure the content is great, but using it as an example of Typst prowess seems ill-advised.
[1] https://huijzer.xyz/files/f72fa09561f20162.pdf
I don't see any issues with the title of Figure 2.2, but the legend and the x-axis label have weird letter spacing indeed. It seems like images like this are standalone (https://github.com/rikhuijzer/phd-thesis/blob/main/images/pe...) and probably aren't generated by Typst. So perhaps the weird spacing is not Typst's fault.
I took a look at the repo and it's probably the fault of the SVGs of the graphs, not of Typst itself. Now, you could have used Typst libraries to generate the graphs, but back then (2 years ago, I think?) it was probably a struggle.
Yea, I don’t see the point of criticizing minutiae from a thesis that has already been accepted, but I agree, the graphs look out of place and generally not in the same style as the other text. Also, I guess I am just really used to LaTeX’s font; it automatically gives an academic style that I don’t get from this. Again, pure personal bias.
How about adding a PDF release ;-)
Not a LaTeX post without someone talking about Typst. Come back when the HTML output works. Not having good accessible output was more acceptable back when TeX was invented; it definitely isn’t now, and they made a new system and somehow got this worse than modern LaTeX.
Not necessarily my experience. I wrote (and I am writing) several academic documents with it. There are its quirks, of course, but with good classes such as memoir, I don't feel the need to do a lot more than basic customization in the preamble. Still is a good tool for me.
This mirrors my experience.
It's worth noting that TeX was developed in the same time period that the details of lexical scope were being nailed down by Guy Steele in the Rabbit compiler for Scheme. It's not that TeX is an ad hoc system; it's more the case that people didn't actually know how to implement a better system at the time.
'People' in this case were Don Knuth (TeX) and Leslie Lamport (LaTeX). Both are Turing Award winners.
That's true. Do you know who else won a Turing Award? Tony Hoare.
What is Tony famous for? Well, lots of things, including his very important comparison sort algorithm, Quicksort. But in this context, how about the Billion Dollar Mistake? That's a pretty nasty booboo in many programming languages, for which Tony blames himself because it was his idea.
Like your parent said, TeX shipped a long time ago and we have learned a lot since then; it is no surprise that we know how to do better today. In fact, it would be a serious black mark for Computer Science if we couldn't.
Which means what, exactly.
I like LaTeX for the most part (I have had to use some weird hacks, but usually once they are done they are tucked away in a macro and can be ignored).
But I think the main things it has going for it are that it produces nice output and all the journals accept it. Does there exist a tool that renders Typst to LaTeX? That could play nicely with the existing ecosystem.
Huh. My special lady friend is in the process of finishing up her thesis using LaTeX after ditching LibreOffice. LibreOffice was a nightmare for some of the same reasons: bad UX, bad portability, and crippling bugs. There was a ramping-up period, and she had an out-of-date GitHub repo to help her, but she is incredibly happy that she switched. Collaboration could be smoother, I guess.
LaTeX is used because writing math in it is very good, despite everything else (like tables and figures) being so bad.
That's why people take the math subset of latex and use it in other contexts - exactly like this product.
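For instance, the "math subset" in question is small, standalone expressions like these (examples mine, not from the project):

```latex
% The kind of snippet tools like KaTeX accept on its own,
% with no document preamble or package loading:
e^{i\pi} + 1 = 0, \qquad
\int_0^\infty e^{-x^2}\,dx = \frac{\sqrt{\pi}}{2}
```

The whole surrounding machinery of classes, packages, and page layout is left behind; only the math notation travels.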
As long as the arXiv doesn't accept Typst, it's never going to be a real alternative to LaTeX. And the arXiv maintainers seem either hostile or indifferent to Typst.
+1 for Typst being amazing.
I can actually write my own functions when I need to. I don't think I have ever written a LaTeX macro without having to look up a lot of stuff.
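To illustrate the difference, a minimal Typst function (names made up for illustration) looks like ordinary code:

```typst
// Define a reusable function; `name` is an ordinary parameter.
// Square brackets switch back to markup, so the body reads naturally.
#let greet(name) = [*Hello, #name!*]

// Call it like any function.
#greet("world")
```

No \makeatletter, no expansion-order puzzles; arguments and scoping work the way most programming languages do.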
Yups, I love the idea of LaTeX, LaTeX itself not so much.
I hope Typst eventually gets some equivalent to tkz-euclide, as I've never seen anything even remotely comparable.
Is Typst appropriate for web apps; e.g., the input forms here?
Nope. Typst's primary output is PDF, and it is a stand-alone binary. It's a replacement for most uses of LaTeX to produce documents. It is not a replacement for this project, which focuses only on rendering LaTeX math code and can be embedded in multiple different runtimes.
> The only drawback I can see is the ecosystem being smaller and less mature.
This seems like the _perfect_ use for an LLM: systematically porting over as much of the "ecosystem" to Typst as possible. Is anyone doing that?
Two hours ago a coworker told me that he let an LLM port his LaTeX template to Typst. According to him, it was perfect.
I know exactly what you mean, and that paired with a community that is absolutely sure that they know exactly how things need to be done and everyone that wants it in another way is dumb.
You want Typst: https://github.com/typst/typst
It's like the JSX of LaTeX: markup in a programming language, not a programming language pretending to be markup.
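As a rough sketch of that analogy (a hypothetical helper, not from any real template):

```typst
// Code mode starts at `#`; the bracketed argument is markup again,
// much like JSX nests markup inside expressions.
#let note(body) = block(fill: luma(240), inset: 8pt, body)

#note[This grey box comes from a plain function call.]
```

The boundary between "program" and "document" is explicit, rather than hidden behind macro expansion.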
> I used Typst for a few weeks. It already feels much more understandable, consistent, hackable, and customizable. I guess that is the difference between an ad hoc macro system and an actually thought-through programming language.
> The only drawback I can see is the ecosystem being smaller and less mature. That is, however, counteracted by being able to do things on your own, without immersing yourself deeply in LaTeX for years. Also, it will improve with time.
I don't know why but this chain amuses me: RaTeX -> KaTeX -> LaTeX.
I guess it shows how everyone loves but hates LaTeX and is always trying to bolt on that one last thing that will make it good.
Aren't these 3 different implementations with totally different use cases? KaTeX is a LaTeX-like implementation for the web. RaTeX is really "rewrite KaTeX in Rust". I don't understand what is getting "bolted on" to what here.
Pixel-diff CI against a golden suite is the right discipline for layout libraries, and it's noticeably rare in the JS ecosystem.
Most "matches KaTeX" claims I've seen in the wild rely on screenshot eyeballing, which collapses on edge cases like spacing primes, integral subscripts, and matrix delimiters that scale.
One thing I'd be curious about: how are font fallbacks handled when the same Rust core ships to platforms with different system font availability?
KaTeX bundles fonts and assumes they load cleanly; CoreGraphics and Skia bring their own glyph caches and metrics.
Does the display list carry metric snapshots from the host text shaper, or does the core compute layout from a bundled metric file independent of the backend?
It's interesting to me that the page doesn't describe the size of the Rust binary (relevant for mobile app use cases, where you would need to add the Rust binary to your app) or performance.
The webpage also does read like it was at least heavily LLM assisted, which makes it a bit hard to trust it.
That all said, this is definitely something I'd be interested in using for Zulip if it is indeed going to be a well-maintained open source project.
(We currently have a Node server component that the Zulip server runs only to render LaTeX.)
I suppose somebody has to ask (and most likely complain) about this: what does RaTeX do for accessibility? I gather that it produces images. I checked the demo, and there's not even an alternative text or an ARIA label, so this seems completely invisible to anybody relying on a screen reader. This is quite a step backwards compared to modern LaTeX, which can now tag equations with MathML within the PDF, or using MathJax, or any other serious tool targeting HTML, like pandoc or LaTeXML.
Is accessibility anywhere on the roadmap for RaTeX?
AFAIK, KaTeX has accessibility via MathML included, but it doesn't provide export as an image; rather, it renders as HTML. So why would you want the utility that you use to render an additional image to take care of something that KaTeX already does?
The landing page (clearly written in large part by an LLM) does not mention that both KaTeX and MathJax can render to SVG in Node. This WASM approach might still be lighter, but the advantage is not as clear as the page makes it seem. (It also contains LLM dishonesties, like that the bundle size is 0 KB.)
> It also contains LLM dishonesties like that the bundle size is 0 KB
That one jumped out to me too. The phrasing is so wiggly but technically correct it feels intentional. When I saw it I didn't blame it on the LLM, which is worse.
Otherwise it's a super cool looking project
Where this clearly wins is native rendering use cases where there is no browser or JS engine involved at all.
If RaTeX gets to brag about having a 0 KB JS bundle, the other libraries should be able to say they have 0 KB WASM bundles!
We recently switched from Node.js + MathJax for rendering LaTeX to Goja (https://github.com/dop251/goja) + MathJax, and surprisingly it worked really well. We did this because the app is already 99% Go, and this allows us to eliminate the remaining non-Go pieces, greatly simplifying the SBOM. And yes, we tried go-latex, but it's not nearly as feature-complete as MathJax. Not to mention that Goja + MathJax adds 10 MB to the binary size, while Node.js adds 200 MB+.
Interesting. Reminds me of Typst (both implemented in Rust and replacing TeX to some degree) and Microtex.
I discovered Typst in the last year and used it to build a resume and cover letter template that feeds from a YAML file.
After a bit of tinkering and understanding the idiosyncrasies of Typst, the joy of having reliable, consistent, beautiful, data-driven resumes and cover letters is immeasurable. It basically lifted any barrier to applications, while whatever I had before I had always considered a burden.
On top of that, I can add hiring-process data directly to the YAML file to run further analysis.
Can LaTeX do this? Most probably, but the learning curve is the difference.
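A data-driven setup like the one described might look roughly like this in Typst (the file name and keys here are hypothetical, not from the commenter's actual template):

```typst
// Load structured data; yaml() is built into Typst.
#let cv = yaml("resume.yaml")

// Assumes the file holds a top-level `jobs` list with
// `title`, `company`, and `year` fields.
#for job in cv.jobs [
  == #job.title, #job.company (#job.year)
]
```

Because the data lives outside the template, the same YAML file can feed both the documents and any separate analysis.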
I have been using Typst for creating notes and it is an awesome tool. I use it to create notes on welding for my students. It makes my life so much easier compared to badsoft and its not-word-ing (you understand me).
I greatly prefer Typst's clean architecture to TeX's macro-centric hell pounded into passable utility.
I am so confused. There's already a native version of LaTeX... it's... it's LaTeX. Why would a Rust implementation need to match KaTeX instead of properly implementing a real (and modern, so unicode-out-of-the-box) TeX engine, that LaTeX (which is a set of convenience macros) then trivially runs on top of?
I switched to Typst a year ago and never looked back
Interesting! Would love to see how RaTeX evolves.
> JS bundle (typical) 0 kB JS (core is WASM)
I guess you should mention how much is WASM, right?
I'm sorry but the only thing that truly understands TeX, is, and will forever be, TeX.
Anybody embed it in a markdown renderer yet?