404 File not found: https://blog.mikhe.ch/quake2-on-fpga/part6.md
The site configured at this address does not contain the requested file.
If this is your site, make sure that the filename case matches the URL as well as any file permissions.
For root URLs (like http://example.com/) you must provide an index.html file.
Read the full documentation for more information about using GitHub Pages.
ordu 1 hour ago [-]
Parts 5 and 6 are 404. At the end of the article it says:
"More pictures in the next part.
Next part: coming soon"
I suppose the link came to HN a bit too early.
xracy 5 hours ago [-]
This is really cool and impressive... but relatedly...
Has anyone figured out what the minimum specs for Quake are?
I feel like the first thing everyone does with a computer is check whether or not it can run Quake, and I'm just wondering: what's the simplest computer that could exist that could still run Quake?
klodolph 4 hours ago [-]
You can find a lot of discussion about what the minimum specs for Quake are. Famously, it needs a decent FPU, and the Pentium was a convenient early CPU with a decent built-in FPU. It was significantly faster than a 486.
…But people have managed to run Quake on the 486.
And the myth people tell about Quake is that it killed Cyrix, because Quake performance on Cyrix was subpar. But was that true? And if it was true, was that because the Cyrix was slower than a Pentium, or was it because the Quake code had assembly that was hand-optimized for the Pentium FPU pipeline?
Anyway. “Most simple computer that could run Quake” is probably going to include a decent FPU. If you are implementing something on an FPGA, you can probably get somewhere around a 200 MHz clock anyway, at which point you can run Quake II.
jasonwatkinspdx 2 hours ago [-]
My perspective from being a teen doing LAN party stuff at the time: Quake ran slow on them, but it was far from the only thing that ran slow. Cyrix was well understood to be the value brand, fine for general office apps and such but not up to more demanding computing, and prone to random compatibility issues here and there.
Ultimately what killed Cyrix is that they just couldn't offer enough of a discount vs. Intel to matter, especially with all the lock-in stuff Intel was doing with Dell, Gateway, etc.
Intel Inside was a successful marketing campaign as well. If you were around back then I bet you can imagine the jingle/chord immediately.
polpo 2 hours ago [-]
I had a Cyrix 6x86 when Quake first came out. My disappointment at how poorly Quake ran on it was significant, especially because pretty much every other game at the time ran well on the Cyrix. The FPU performance in Quake was doubly handicapped on the Cyrix: not only was its FPU slower than the Pentium's to begin with, Quake's code was indeed hand-optimized for the Pentium's FPU pipeline. Fabien Sanglard's writeup of Michael Abrash's optimizations for Quake goes into great detail: https://fabiensanglard.net/quake_asm_optimizations/
NooneAtAll3 4 hours ago [-]
Can it be rewritten to use fixed-point arithmetic instead?
klodolph 3 hours ago [-]
I want to look at this from a different perspective… a single-precision floating-point multiply is pretty simple, no? 24x24 bit multiply, which is about half as many gates as a 32x32 bit multiply.
Maybe I would prefer to rip out the integer multiplication unit first, before ripping out the FPU.
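To make the gate-count comparison concrete, here's a toy software model of a binary32 multiply built around that 24x24-bit significand product (normal numbers only, truncation instead of IEEE rounding, so a sketch of the datapath rather than a complete FPU):

```c
#include <assert.h>
#include <stdint.h>
#include <string.h>

/* Toy binary32 multiply: sign XOR, exponent add, 24x24-bit
 * significand multiply, one normalization shift. No NaN/Inf/
 * subnormals, and it truncates rather than rounds to nearest. */
static float f32_mul(float fa, float fb)
{
    uint32_t a, b;
    memcpy(&a, &fa, 4);
    memcpy(&b, &fb, 4);

    uint32_t sign = (a ^ b) & 0x80000000u;
    int32_t  exp  = (int32_t)((a >> 23) & 0xFF)
                  + (int32_t)((b >> 23) & 0xFF) - 127;

    /* 24-bit significands, implicit leading 1 made explicit. */
    uint64_t ma = (a & 0x007FFFFFu) | 0x00800000u;
    uint64_t mb = (b & 0x007FFFFFu) | 0x00800000u;

    uint64_t prod = ma * mb;       /* the 24x24 -> 48-bit multiply */
    if (prod & (1ULL << 47)) {     /* product in [2,4): renormalize */
        prod >>= 1;
        exp += 1;
    }
    uint32_t frac = (uint32_t)((prod >> 23) & 0x007FFFFFu);

    uint32_t r = sign | ((uint32_t)exp << 23) | frac;
    float out;
    memcpy(&out, &r, 4);
    return out;
}
```

Everything outside the 48-bit product is cheap glue logic, which is the point: the multiplier array dominates, and 24x24 is roughly half the partial products of 32x32.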
Narishma 4 hours ago [-]
The PS1 doesn't have an FPU but got a version of Quake 2, so it's possible. That said, it was somewhat different from the PC version, so it could be argued that it's not the same game.
klodolph 3 hours ago [-]
The PS1 version definitely has its own engine; it's not just a port of the Quake 2 engine to the PlayStation but a new engine.
jasonwatkinspdx 2 hours ago [-]
I can't speak on Quake, but I was a level designer on the failed effort to port Unreal to PSX.
My understanding from talking to the coders at the time was that Unreal's software renderer was a huge advantage as a starting point. They were able to reuse a lot of the portal rendering stuff as setup on the R3K CPU, but none of the rasterization. That had to go to the graphics core, which was a post-setup 2D engine that, in addition to the usual sprites, could do tris and quads.
We had a budget of about 3k polygons post clipping, and having two enemies on screen would burn about half of that. The other huge limit was that the texture cache was tiny, so we couldn't do lightmaps. Our lighting was baked in at the vertex level and it just was what it was. There's a bit more info here: https://www.terrygreer.com/unrealpsx.html
I imagine the situation with Quake was comparable. The BSP stuff would carry right over, but I can't imagine they got proper lightmapping working at the time. They'd also need some sort of solution for overdraw, since Quake's PVS was a lot looser than Unreal's portal clipping.
apgwoz 43 minutes ago [-]
I played Quake on a 66 MHz 486 DX2 with 16 MB of RAM in the 90s. At the lowest resolution, but it was fine.
conception 5 hours ago [-]
That’s only because everything can run Doom now.
markus_zhang 6 hours ago [-]
This is very impressive. How did you learn to design a real computer, not the toy ones a lot of people make? I read parts 1 and 2, and it looks like you just "threw in" Ethernet and other stuff and it was done. I really hope to learn from the process, thanks!
argulane 3 days ago [-]
That's some mad dedication, to go from KiCad schematics to running Quake. Very impressive!
UncleOxidant 5 hours ago [-]
Cool! Have you considered offering this board on Crowd Supply or similar? There don't seem to be many boards available for Efinix FPGAs.
bee_rider 4 hours ago [-]
Quake 2 was the one with the clever approximate inverse square root code, right? I wonder (especially since there’s an instruction nowadays to draw inspiration from), can you implement it “in hardware,” so to speak?
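The code in question is usually associated with the released Quake III Arena source rather than Quake 2. As a software routine it looks like the sketch below; the modern instruction alluded to is presumably x86's RSQRTSS, which computes a similar low-precision approximation in hardware:

```c
#include <assert.h>
#include <math.h>
#include <stdint.h>
#include <string.h>

/* The well-known approximate inverse square root, as popularized
 * by the Quake III Arena source release. One Newton-Raphson step
 * brings the relative error under roughly 0.2%. */
static float q_rsqrt(float x)
{
    float half = 0.5f * x;
    uint32_t i;
    memcpy(&i, &x, 4);              /* reinterpret float bits */
    i = 0x5F3759DFu - (i >> 1);     /* magic initial guess */
    float y;
    memcpy(&y, &i, 4);
    y = y * (1.5f - half * y * y);  /* one Newton refinement */
    return y;
}
```

In hardware terms the bit trick is just a subtract and a shift on the raw encoding, so the expensive part of a dedicated rsqrt block is the refinement, which fits naturally in an FPGA pipeline.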
I have the album on my phone. When I get called in to put out a fire and save the day, I like to put on March of the Stroggs in the car when arriving at the destination. It's a great soundtrack for two reasons: the first one is sweet, wasted youth, and the second is that it's a great soundtrack.
skewbone 1 hour ago [-]
And Jer Sypult, who made Climb (track 10) and is not in Sonic Mayhem!
phendrenad2 3 hours ago [-]
Hey, routing your own length-matched traces, nice. Is this Altium?
absynth 3 hours ago [-]
Another board has become Frag complete. Important milestone!
https://youtu.be/Zdy9TtInX-c
Lots of Quake II samples.