Show HN: Building a web server in assembly to give my life (a lack of) meaning (github.com)
matteohorvath 37 minutes ago [-]
It's a beautiful project, well crafted. To reflect on the other comments: projects like this are like Minecraft maps to me. There are giant and amazing maps, small survival maps hosted locally for my friends and myself, and commercial, high-scale servers. Building a house or designing a new road on a server became extremely easy with AI, but the value created in the world depends on the original purpose of the server and whether creating more houses and roads actually makes sense. I think it's a super thing that commercial servers can build out faster and be bigger, with more houses and roads on them, but the love an art project creates in the world is incomparable.
rdevilla 3 hours ago [-]
Ten years ago, I would have kowtowed to someone elite enough to build something like this.

Today, I just think, "how long would LLMs have taken to write this?"

I mourn the death of a human artform.

wewewedxfgdf 2 hours ago [-]
It's far more exciting than sad.

Got an idea that you'd need assembly language for - now you can do it instead of..... never doing it because it would have been impossible for you in any practical way.

Look to the positive instead of lamenting something that never would have happened.

It's unbelievably exciting that you can now program a computer virtually without the limitation of your ability to hand code it.

nzhsbdb 2 hours ago [-]
The result is unimpressive either way -- it's the journey that is exciting for these kinds of projects
wewewedxfgdf 2 hours ago [-]
I understand that for some people it's the display of human wizardry that matters.

For me it's about making the computer do awesome things - I do not care how I get there I just want it to do whatever I can conjure in my head.

Thanemate 1 hour ago [-]
As much as I enjoy the novelty of asking ChatGPT for anime pictures, I do not, for a single moment, consider myself a doer of anime pictures.

And as a fair aside, the result will be a "good enough" approximation of what I conjured in my head, but never the thing itself. To make the exact thing I conjured, I'd have to pick up the mouse and draw the rest of the owl. I don't know whether that's more telling of my demanding imagination or of my standards.

behaviors 23 minutes ago [-]
I do believe this is just the next step in languages. We've come this far trying to make code closer to natural language, and now we have the closest thing to a translator our generation will see. It's an exciting time; just don't pay attention to the talking heads.
estebarb 2 hours ago [-]
It has always been possible to do it. LLMs are not a particular enabler for that.

The difference is that now it is worthless: there is no learning, no person caring about the result, nothing aspirational for the public to look towards... we used to enjoy those challenges, used to be proud of solving complex problems... now? Yeah, whatever, execute execute commit push, let another LLM "review" and call it a day.

menzoic 1 hour ago [-]
The difference is not that it's "worthless". The difference is that it's now "practical" to implement, given the low effort.

I wouldn't be sad about defeating lower-complexity challenges. There are always higher-complexity challenges that arise once we start operating in a world where you can do more. The bar rises.

rdevilla 1 hour ago [-]
The point is the death of the celebration of excellence and technical mastery.

Once insurmountable challenges are now trivial to implement with, as you say, "low effort."

For those who were attracted to computing by the grind and the grand narrative that you, too, with sufficient effort, discipline, and merit, could become a revered craftsman, LLMs trivialize an entire lifetime of effort. I can't think of anything more demoralizing.

lofaszvanitt 3 minutes ago [-]
Yep, another humane thing going to get killed, because people are naive, gullible and basically idiots handing out their expertise on a platter to faceless corpo entities.

What's next, human-to-human contact abstracted away by brain stimulation?

And the transhumanist arsewipes gonna have a field day.

Never too late to ignite the nukes...

locknitpicker 56 minutes ago [-]
> The difference is that now it is worthless

Writing whole software projects in assembly has been worthless and pointless for a couple of decades now. Even the projects that can put together a solid case will limit assembly to very specific components executed only in specific bits of a hot path. Perhaps the most performance-sensitive code we have today is high-frequency trading, and that field is dominated by C++.

Also, virtually all mainstream compiler suites have flags that output assembly, and that feature is largely ignored and unused.

georgemcbay 2 hours ago [-]
> Got an idea that you'd need assembly language for - now you can do it instead of.....

Nobody actually needs a web server built in assembly language, it serves no practical purpose. And I say that as someone who learned to program 6502 assembly language in 1983 and has sporadically used assembly of various architectures since.

The absurdity of building it would have been the curiosity draw pre-LLMs, but when its existence is just a series of prompts away, it really loses all of its meaning.

But yeah... hooray for AI. Can't wait until we learn to harness it to supercharge the most important and valuable thing we do as a human society in modern times: stuff increasingly intrusive ads in front of everyone at all times.

mcintyre1994 1 hours ago [-]
> Can't wait until we learn to harness it to supercharge the most important and valuable thing we do as a human society in modern times: stuff increasingly intrusive ads in front of everyone at all times.

Wasn’t it used for that before anything else? Google invented transformers and had LLMs internally before chatgpt got released. Presumably they were using them for ads, because their public demos were insane things like talking to the moon.

pixelesque 40 minutes ago [-]
> Wasn’t it used for that before anything else? Google invented transformers and had LLMs internally before chatgpt got released.

According to friends who worked at Google (no direct knowledge myself, so don't know exactly how true it is), they mostly sat on the tech. Google News had internal prototypes of using them to expand/contract/summarise and/or add details/context to news articles and translate them to different languages, but it was never fully productised.

Then after ChatGPT got popular, sudden panic to start using them in products company-wide.

dinkumthinkum 2 hours ago [-]
> without the limitation of your ability to hand code it.

Isn't that kind of view pathetic and sad, though? Why would anyone pick up a guitar or play a piano if they could just listen to the same song already made by someone else? I struggle to understand this view from people who pretend not to understand why being an expert at some skill is perceived as valuable. It also belies the next problem with this line of thinking: it says "we don't need to learn X to do Y because we have AI" but misses that the same AI could easily replace the need for you to think to do Y in the first place. I don't know.

designerarvid 36 minutes ago [-]
I think that the analogy of recorded music best captures your feeling. Not the exact technological and economic transformation that is happening, but the feeling.

Some 120 years ago, music was a living phenomenon produced in the moment. Musicians worked at restaurants and coffee shops everywhere, being useful without being superstars.

Music didn’t disappear with recordings, but the work is certainly different.

qingcharles 1 hour ago [-]
The answer is "no time at all." I used Gemini Ultra earlier this year to see how well it would do with some really gnarly assembler. I asked it to write a whole flat-shaded 3D engine in 8086 assembler that would run in CGA on an original XT and it one-shotted it in a couple of minutes.

https://imgur.com/a/Dy5rUku

isatty 2 hours ago [-]
Human artform is still alive and well as evidenced by this post.

Yes, an LLM can write it, it’ll probably work. Yet, it’ll remain meaningless slop while this is not.

locknitpicker 60 minutes ago [-]
> Ten years ago, I would have kowtowed to someone elite enough to build something like this.

I'm afraid it's an elite skill in the sense that juggling is also an elite skill. It's impressive for the first few seconds you gaze at it, but once the novelty factor wears off you understand that it's wasted effort that leads to a project that suffers from a massive maintainability problem, is limited in which platforms it can run on, and brings no advantage whatsoever. It's a gimmick that has no practical use.

This is the software development equivalent of an amateur guitarist posting shredding videos on YouTube.

altmanaltman 2 hours ago [-]
I get what you mean, but I feel this newly profound yearning for "hand-crafted" code is getting a bit out of hand. Software engineers have taken shortcuts whenever possible since software was a thing. Do you also mourn that we don't code airplanes by hand anymore (i.e. the death of the "craft of coding")?

We need to stop thinking of software engineers as carpenters, where the magic is some physical skill and that is the "CRAFT WE MUST PROTECT".

And at least your comment was grounded in reality; a lot of people I talk to (who are not coders) seem to think a good software engineer writes every line and every word with thoughtful genius while AI just spams code, so one is better than the other. And they're convinced it's some nuanced, smart take and that they understand software development on an inner level or whatever.

And the base assumption still holds true (pure AI-generated code is garbage), but that's mostly because it's badly designed; AI is still a pretty poor architect. There is a need to push back against slop, but why do we need to elevate typing code as if it's some sacred activity? Most of the work a good coder does is in their mind, with little connection to the physical reality of the world.

Thanemate 3 hours ago [-]
I'm oddly enthusiastic about seeing someone who brings the HACKER in HackerNews. But at the same time, this made me remember the days when displays of skill and craftsmanship were rewarded in the industry.

Maybe it's finally time to move on from being a career programmer.

noduerme 3 hours ago [-]
What a dismissive comment. Now that anyone can have an LLM write code for them, the only people who have value to bring to a project are the ones who can improve upon the LLM's output. That is, the ones who have a deep enough understanding of the logic and language. And the only people who will ever be in that position are the ones who take the time and effort, out of sheer curiosity, to learn how things work. Whatever your alternative is to this, there is no future in the alternative.
dwedge 2 hours ago [-]
Artisanal code has a future. Maybe not a highly paid one, but maybe we go back to our roots. If you enjoy programming and were never focused on output or on pipelines, an LLM doesn't offer the same experience.
noduerme 43 minutes ago [-]
Sometime around when WordPress came out, or at least 2005 or so, I started positioning myself as a bespoke web designer, then app developer. Whereas anyone could get a site done, I turned to doing things that hadn't been done before, for which standard solutions wouldn't fit. I turned away 80% of jobs and raised my rate from $25 to $100, then to $300/hr. To me, pricing and only doing bespoke work was a defensive measure against falling into a career hole I didn't want to end up in. But mostly it was just that I didn't want to repeat myself or waste my time doing something a client could already buy off the shelf.

Artisanal code, or bespoke code, has always been the best paid and most satisfying work. If we no longer have a new generation of curious people who enjoy solving hard problems, it's only going to become more valuable.

dinkumthinkum 1 hour ago [-]
I don't see it as dismissive; maybe you two are talking past each other but are on a similar side. I think the parent just articulated a sense of resignation that many people probably share, and you might be saying that there is still some shred to hold onto, possibly.
shevy-java 2 hours ago [-]
I don't see anything dismissive here. It is a realistic assessment: if the choice is between code generated by AI or code generated by a human, and the AI is better in an objective manner, then why should a company employ a human? I refer here solely to the code result; naturally humans may do things AI can not do yet, but if the question is solely about code quality and AIs are better here, then why would that comment be dismissive rather than realistic?

> And the only people who will ever be in that position are the ones who take the time and effort, out of sheer curiosity, to learn how things work.

People learn something new all the time; AI does not learn anything, it just simulates and hallucinates. But that doesn't address the core question: what would you do if you have to compete against AI, and AI is better? We already see this with the new generation of humanoid robots from China. Those things make Boston Dynamics robots look like tinker-toys in comparison, already as-is. Give it ten more years and we'll finally have reached AI Skynet for real.

noduerme 1 hour ago [-]
What do you mean when you say AI code is better? I look at AI code all day and it's just garbage that happens to work for whatever feature was requested; in no way is it better code. Any human as careless as an AI who committed such atrocities would be fired.
stbev 2 hours ago [-]
I am attempting to write a software renderer in WebAssembly because, for some reason, I feel the need to go against the direction this vibe coded world is going, and I want to feel challenged again. I don't know if I will ever finish it, it is crazy, and by no means useful. But gosh it feels so good.

Congratulations to the OP for the accomplishment.

polaris64 7 minutes ago [-]
I did exactly the same and it was so much fun. It wasn't about bringing anything novel to the table, it was just a fun challenge for myself. I finished and now I'm writing a game using it, although now the challenge has gone I am not making much progress on that. But never mind, I had fun! I wouldn't have had that fun or satisfaction if I had vibe coded it instead.
qingcharles 1 hour ago [-]
As in 3D software renderer? I cut my teeth on those throughout my teens and the start of my professional career, in x86 and C.

I wanted to see how an LLM would do writing one in pure 8088 assembler for CGA and it one-shot a nice demo (I fed it the vectors for the Elite ship in the prompt):

https://imgur.com/a/Dy5rUku

stbev 4 minutes ago [-]
Yes, exactly, a 3D software renderer. But the goal is to do (almost) everything from scratch and by hand. No LLMs, no std library, no compilers. Just a few imported math functions (such as sin and cos). Not the same as bare metal programming but close
PacificSpecific 2 hours ago [-]
Please post your progress! That sounds cool as hell
stbev 3 minutes ago [-]
Thank you! I will keep working on it and post something here
tgma 1 hour ago [-]
If you actually start writing big stuff in assembly, especially with a macro assembler, you'd quickly realize it is more verbose but not fundamentally that different from higher-level programming. You basically need to get the hang of building abstractions with procedures and macros and you'd be good to go. Reading assembly effectively is often much harder than writing it.
imtomt 1 hour ago [-]
Yeah, that's what I realized during this, too. You need to be much more explicit, but the way any given function works isn't fundamentally different. "strlen" will always iterate through a string searching for a NUL byte, whether it's in C, Rust, assembly, or whatever other language. I think it can feel almost more straightforward than other languages, since you're laying out exactly what the CPU needs to do, in what exact order.
sureglymop 56 minutes ago [-]
Agreed, and super cool project. After seeing Matt Godbolt's Advent of Compiler Optimisations in December, I decided to do AoC in assembly. It was the most fun I'd had in years, even though I didn't finish all the days!

And super educational. Since then I've been pondering which problems require dropping down to the assembly level. E.g. implementing a JIT compiler, a coroutine runtime, etc.

behaviors 1 hour ago [-]
Well done. Been working on a similar smaller project for RISC-V. This is excellent
imtomt 1 hour ago [-]
That's so cool! I would love to see it if you're sharing it anywhere.
behaviors 14 minutes ago [-]
It's an HTML browser for the Pi Pico 2, CLI-based, meant to support my in-house project running on a mesh of Pico 2s. I really wanted to use RISC-V, and it needed a webhook that serves a page on PIO wake. The browser is about 60% written; the server side is already handled ;P I found this awesome project someone posted on HN. When I complete my project, the browser will be released alongside it. You can very likely reproduce it with less than a handful of prompts. One thing I really do believe: ideas are going to be the next open source. LLMs can make ideas into things.
trollbridge 5 hours ago [-]
Gave me a warm feeling to know that someone would actually still bother to do this by hand. I'm not the only one!
imtomt 5 hours ago [-]
Thank you! I've been obsessed with this idea for a while, finally decided to start on it, then obsessed over it for a couple weeks. I'd love to see some of your projects if you have anything similar, I'm glad I'm not the only one too! I think most programmers would benefit a lot from taking a few weeks or months to try and learn some assembly, and demystify how CPUs and compiled languages work.
dddddaviddddd 3 hours ago [-]
Even though it's a meaningless comparison, I'd be interested to see how performance compares (max requests per second?) for this compared to fully-featured web servers.
imtomt 3 hours ago [-]
Honestly haven't benchmarked it, but I would imagine ymawky would be considerably slower than most fully-featured web servers. ymawky uses fork-per-connection, which is fundamentally slower than what production servers like nginx or Apache use. nginx uses event-driven IO (kqueue/epoll), which can handle thousands of concurrent connections without the overhead of forking the process on each request. Apache uses pools of threads which handle multiple connections without needing to be spawned per-request. A head-to-head against any other web server would mostly measure "fork-per-connection vs event loop/thread pools", which assembly has nothing to do with.

In a comparison between a similar fork-per-connection server written in C and this, I would imagine the throughput would be about the same, because the bottleneck in this model is fork() itself rather than the actual code. It probably matters more for binary size and startup time than requests/sec. Would be fun to actually benchmark, though.

arrty88 3 hours ago [-]
Should I ask my Claude to benchmark it against nginx, or will you ask yours?
chrisweekly 5 hours ago [-]
That fake O'Reilly book cover is pure gold.
imtomt 5 hours ago [-]
That book is exactly what inspired me to make this in the first place, haha. The subtitle of the book gave me the acronym I named it after.
____tom____ 3 hours ago [-]
Fauxreilly!
dragontamer 2 hours ago [-]
Hmmmm.

One of my first assembly projects was a CGI Script 100% in x86 assembly.

A full web server is certainly more impressive! Though I'd recommend to beginners to look up CGI and mod_cgi in Apache first lol

imtomt 6 minutes ago [-]
Woah! I honestly feel more intimidated writing a CGI script in assembly than I was writing a server, lol. CGI support has been on my mind for a couple weeks, but I haven't really dug into it yet. I'd love to see yours if it's hosted anywhere! Could be a great reference when I do.
dalleh 2 hours ago [-]
With the bubble of LLMs, these projects are really appreciated. Keep up the good work!

P.S.: I would love a copy of that book please!

marc_g 3 hours ago [-]
This is cursed and wonderful. I especially appreciate status code 418. I hope I run into that in the wild one day, then I'll think of you!
ybouane 3 hours ago [-]
We're moving to AI and have stopped writing code and scratching our heads, and you're here writing a web server in assembly.

Humbling.

dwedge 2 hours ago [-]
Yeah, humbling - I know which path I prefer
thatxliner 5 hours ago [-]
I want to read this repository as a learning tool, so it'd also be nice to include docs about the architecture of the code; even AI-generated docs would help, but obviously I'd prefer docs with your own design notes and decisions.

Really cool project though!

imtomt 5 hours ago [-]
Thanks, I appreciate it a lot! I tried to comment my code pretty heavily (~3000 lines of code, ~1000 lines of comments all together), since this was a learning project for myself in the first place. Hopefully those will be of some use. But separate in-depth documentation is definitely a good idea, I'll work on adding that. In the meantime I'm always down to answer any questions about it!
thatxliner 5 hours ago [-]
My first question would be where should I start reading? It seems like you modularized it into multiple assembly files (how does that even work?)
imtomt 5 hours ago [-]
Honestly, read the main file, ymawky.S first. Then I'd read through get.S maybe, checking parse.S on an as-needed basis for parsing-related functions. delete.S or options.S are pretty short, too, so give those a read too.

Modularizing it into multiple files was easier than I expected it to be, you basically have other functions/labels in other files, and mark them as .global at the top. The Makefile compiles each file into their own .o, which you then link all together. You can "b" or "bl" to any label from any other file, as long as it's global and linked together. Same with data in .bss or .data, mark them as .global and they can be accessed from elsewhere.

vasco 3 hours ago [-]
If you'd be happy with that then you can generate them yourself!
Ati985 3 hours ago [-]
Your determination to make this happen was remarkable — and you truly accomplished it. Congratulations
mappu 3 hours ago [-]
Syscalls on macOS aren't guaranteed to be stable - Go found out the hard way and in 1.12 they changed to call libSystem.dylib instead.

In general, stable syscall numbers are just a Linux thing. Everyone else uses blessed system libraries

imtomt 3 hours ago [-]
Yeah, I know MacOS syscalls aren't stable. Interesting point about Go, I hadn't heard about that. Unfortunately I'm a masochist though, and want to avoid libSystem.dylib as much as possible. The only reason I link against it at all is because MacOS requires it for executables to run, I never actually call into it. Figured I'd just update the syscall numbers if/when they change.
cylinder714 4 hours ago [-]
Here's a piece on writing portable ARM64 assembly: https://ariadne.space/2023/04/12/writing-portable-arm-assemb...
imtomt 4 hours ago [-]
Thanks for the link, bookmarking. I should note ymawky's main portability issues are unfortunately at the syscall layer rather than the asm layer. proc_info() and getdirentries64() are pretty Darwin-specific, so making it portable would require reworking that whole area rather than adjusting register/calling conventions.
AppAttestationz 2 hours ago [-]
I suspect that the test suite isn't great. Bun has so many different behaviors compared to other JS engines, sometimes just plain wrong or contradicting the spec. The test suite didn't catch those.
niftynanometer 35 minutes ago [-]
Insane
_the_inflator 5 hours ago [-]
I feel the guy's suspicion towards any high-level language. I programmed exclusively in assembly on the C64 and Amiga, and then recognized that this wasn't sustainable on the PC, because there were more and more edge cases and different machine configurations.

I had a very hard time simply using, never mind fully utilizing, C++ or Java.

C, and Turbo Pascal especially, was easier because the compiled code closely resembled hand-written code.

As the author described, you can do in 4,000 lines what others can do, with way less pain, in 100.

So you build macros, come up with your own library, and in the end you've kind of built a meta-language on top of assembly, because some lines are so hard to grasp that you delegate working code into a library for reuse.

It is funny how much we take conventions for numbers for granted. If you happen to know assembly and its intricacies, you immediately learn to work with the sign bit that marks negative numbers. But how do you know? Maybe you use the whole addressable space only for positive numbers.

Small things that make a huge difference.

Nice article, I enjoyed your adventures and would do the same.

imtomt 5 hours ago [-]
Thank you! The thing about eventually building your own meta language ends up happening all the time with bigger assembly projects. I do have a fair few quality-of-life macros too, but probably fewer than I should. I did end up needing to implement by hand what would be standard functions, things like atoi, itoa, strlen, memcpy, streqn.

Higher-level languages are more convenient for 99% of things, but the directness of assembly gives me a rush unlike any other. I didn't live through the C64/Amiga era, but I was obsessed with old C64/ZX emulators growing up.

qingcharles 1 hour ago [-]
I don't know. Certainly the PC had a lot of options, but it wasn't impossible. My first piece of commercial software was written entirely in x86 assembler and had to navigate things like graphics card options and multiple sound card options. It could be done, it was just a lot more of a PITA.

Once I was doing 3D I quickly started moving everything but the inner loops to Turbo C, because I'm not a total masochist :)

digitaltrees 5 hours ago [-]
I don’t know why, but this project has me irrationally excited!
shevy-java 2 hours ago [-]
If it is written in assembly, why is it for MacOS only?
DavidPiper 1 hour ago [-]
Assembly for the correct architecture is only one part of getting an executable running on a machine.

- Dynamic libraries (e.g. for calling into the kernel, but also user space dynamic libraries) are OS-specific (.so for Linux, .dylib for macOS, .dll for Windows)

- Executable format is OS-specific (ELF for Linux, Mach-O for macOS, PE for Windows)

- Dynamic loading and linkage of both the above are also therefore OS-specific

washingupliquid 3 hours ago [-]
Didn't Steve Gibson do this like 25 years ago? AFAIK his "Shields Up" site is written in Win32 assembly.
eptcyka 3 hours ago [-]
Then it is unlike this, as this is written in arm64, not x86, and not for Win32.
rogeliodh 5 hours ago [-]
Awesome. Any resource recommendations to learn ARM assembly?
imtomt 4 hours ago [-]
Honestly, just reading existing assembly to get a feel for how it works, and then violently googling everything that goes wrong. The ARM Architecture Reference Manual (aka "The ARM ARM") ended up being really helpful for looking up what specific instructions do and how they're called. Another really helpful tool is writing something in C/C++, and compiling with "gcc -O1 -S file.c" to see the assembly gcc generated. It helps to mess around a lot with smaller programs in gdb or lldb.
bananaboy 4 hours ago [-]
This is amazing, great work! I love it!
boring-human 4 hours ago [-]
Even after we've all retired (pretty soon for those who can afford it) or transitioned out of software engineering (for those who can't), we'll still get to amuse each other with home-brew projects like this. Warm fuzzy feeling - I'll take it!
imtomt 4 hours ago [-]
Thank you! This is one of the nicest things I've heard in a while.
arrty88 3 hours ago [-]
Love this so much.
JSR_FDED 4 hours ago [-]
This is a great resource, thank you!

The last time I did anything in assembler was x86 under DOS. Your code makes ARM64 with a modern OS less scary than I thought it would be.

t-3 3 hours ago [-]
ARM is very nice to write assembly for. Having a proper load/store, register-centric architecture rather than a stack-centric one like x86 makes the mental load of writing code go waaay down, so the attractiveness of HLLs for ease of writing code is greatly diminished on RISC.
xyst 4 hours ago [-]
Need a straight binary port now
imtomt 4 hours ago [-]
Why stop there? Next, I'm prying open a CPU and poking the transistors with a 9V battery and paperclips to make it execute what I want. Slower, but you get so much control.
nunez 3 hours ago [-]
Where's your SKILLS.md? How did your agents make this?

jk. Metal as fuck. Love it.

imtomt 3 hours ago [-]
Ahh you caught me. I just kept telling ChatGPT dot com "no, make it less efficient" and copied whatever output it gave me. jk, thank you!
jjbigs 5 hours ago [-]
This is fucking nuts
faangguyindia 4 hours ago [-]
I've used Python (django/flask/fast api), Java (springboot), Ruby on Rails for writing web applications and APIs.

Nothing beats Go.

When you use HTMX (goat) + sqlc (goat) + pgx (another goat) + Chi (yet another goat) and SQLite (goat).

Most apps will not need anything more than SQLite; I've got several SQLite apps doing a couple of million visits per day.

Compiles to a single binary, blazingly fast.

Deploy using a systemd service, capture logs with an Alloy/Loki/Grafana setup, set up alerts and monitoring, and go home.

And you can serve millions of requests on a server with 512MB RAM.

I don't think you'd ever need more speed than this.

Everything else is bloated, slow and doesn't give you enough room for optimization.

Here's the latency of one of my hobby projects (network latency not included): https://i.ibb.co/hJ6FQtyw/d3d6c9d15765.png

Request rate: https://i.ibb.co/Fq80nfJ4/67fcdbdb7491.png

It's running in the US and EU (helps avoid the Atlantic roundtrip tax), and in this one I am doing some hundreds of checks, not simple CRUD work. With Go you can optimize a lot without the complexity of Rust.

losteric 4 hours ago [-]
Can you share what some of those apps are?
arrty88 3 hours ago [-]
I’ve written all of these languages and more professionally. I agree none match the speed and simplicity of golang. Go is that efficient.
iamgopal 4 hours ago [-]
How are you merging sqlc and pgx with sqlite ?
lelandbatey 4 hours ago [-]
Specifically how can you use pgx with sqlite while pgx is a postgres-specific library? Sqlc works great with Postgres or Sqlite, Sqlc works with pgx when connecting to Postgres, but pgx can't be used with Sqlite AFAIK
sampullman 4 hours ago [-]
Did you reply to the wrong submission?
tonyedgecombe 4 hours ago [-]
They replied to the title.
plusplusungood 4 hours ago [-]
What's your point?
faangguyindia 4 hours ago [-]
You don't need to use assembly for high performance web app when you can just use Go.
ericbarrett 4 hours ago [-]
You don't, but it's so much cooler when you do! Not everything needs to be a beige utilitarian module optimized for business consumption.

I didn't need to implement an Intel RDRAND streamer in C and assembler, but it was a ton of fun: https://github.com/ehbar/rdrand-stream

OP, I really liked this project. Kudos for publishing it!

imtomt 3 hours ago [-]
Woah, that's really cool! I'm glad you did that even if you didn't need to. I honestly think everyone needs to write more assembly, because it's so much cooler.
imtomt 6 hours ago [-]
This post seems to now link to the writeup rather than the repository, sorry! The repo can be found at the top of that page, or directly here: https://github.com/imtomt/ymawky
dang 5 hours ago [-]
Whoops that was my fault. Fixed now. (I emailed you, btw, that we'd changed your title, but I forgot to switch the URL back to the repo. Both links are cool.)

I'm sure I'm not the only one who has fantasized about doing something like this as a self-soothing enterprise. Kudos to you for actually doing it!

imtomt 5 hours ago [-]
Hey, thank you! Means a lot. It's an odd sort of meditation, but surprisingly it's the most almost-therapeutic project I've worked on. Something about the constraints of assembly really pulls you into the minutiae and clears your head, maybe.
OutOfHere 5 hours ago [-]
An agentic LLM should be pretty good at Arm64 assembly generation, but maintainability of large code could become an issue. Why would it not run on Linux?
imtomt 4 hours ago [-]
I wrote it for MacOS because I don't have a Linux machine right now :( Once I get one up and running again, I'll probably work on porting this.

As for why it wouldn't run on Linux, there are some pretty big differences in the actual assembly. One pretty superficial difference is calling conventions -- MacOS uses the x16 register for syscall numbers, Linux uses x8. Calling the kernel in Mac uses "svc #0x80", in Linux it's "svc #0". That's ~120 lines that need to be replaced, but easy enough to just use sed. Syscall numbers are all different, as are the struct layouts for sigaction(), MacOS has an "sa_tramp" field that Linux doesn't have. Enforcing max processes is done here using the MacOS-specific proc_info() syscall, which can be used to get the number of children any given process has. Linux doesn't have an equivalent, so process tracking would need to be done differently. Finally, Linux has the getdents64() syscall, rather than getdirentries64(), which uses a different struct and is called differently.

I'm sure an LLM could make all those changes, but it's a pretty large codebase, so it would probably make some mistakes or miss things.

tgv 3 hours ago [-]
You could always start in a virtual environment.
shepherdjerred 5 hours ago [-]
The first paragraph of the README says this was hand written so I’m not sure why you’re bringing up LLMs
OutOfHere 35 minutes ago [-]
Because it's absurd for most people to write that much assembly by hand.