The Bq suggestion doesn’t actually fix anything. Becquerel is defined as one decay event per second and is dimensionally identical to Hz. Using Bq typically signals that a Poisson process is being measured, which is itself an assumption about the arrival statistics. That assumption is likely wrong for real web traffic, which tends to be bursty rather than memoryless.
More importantly, the claim that Hz is inappropriate for non-periodic phenomena is false. Many random processes have a well-defined Fourier transform, and reporting the intensity of random fluctuations in a frequency range is standard across signal processing, neuroscience, finance, and physics. The unit doesn’t imply periodicity of the process itself; it implies that we are working in the Fourier domain, which applies as much to periodic signals as to stochastic processes.
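A minimal NumPy sketch of that point (the sampling rate and signal are made-up illustration values): the periodogram of pure white noise, a signal with no periodic component at all, still lives on a frequency axis measured in Hz.

```python
# Estimate the (unnormalized) periodogram of a purely stochastic signal.
# The frequency axis is in Hz even though nothing here is periodic.
import numpy as np

rng = np.random.default_rng(0)
fs = 1000.0                       # sampling rate in Hz (illustrative)
x = rng.normal(size=8192)         # white noise: no periodic component

X = np.fft.rfft(x)
psd = (np.abs(X) ** 2) / (fs * len(x))        # periodogram estimate
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)   # frequency bins in Hz

# White noise has a roughly flat spectrum up to the Nyquist frequency fs/2.
print(f"bins up to {freqs[-1]:.0f} Hz")
```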
If you want to characterize web request traffic properly, the right question is what the arrival process actually looks like. A single scalar, whether in Hz or Bq, throws away almost all of that. In every case, you have to think carefully about what your underlying assumptions are and what the reported number actually measures.
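To make the burstiness point concrete, here is a toy simulation (the rates and the on/off model are invented illustration values, not a model of real traffic): two arrival processes with the same mean rate — the single scalar you'd report — but very different variance.

```python
# Two arrival processes with the same mean rate (the single scalar you'd
# report in Hz or Bq) but very different statistics. All numbers are
# illustrative; the on/off model is a crude stand-in for bursty traffic.
import numpy as np

rng = np.random.default_rng(42)
seconds, rate = 1000, 10.0                  # observation window, mean req/s

# Memoryless (Poisson) arrivals: counts per 1 s bin.
poisson_counts = rng.poisson(rate, size=seconds)

# Bursty arrivals: quiet 80% of the time, 5x the rate the other 20%.
on = rng.random(seconds) < 0.2
bursty_counts = np.where(on, rng.poisson(rate / 0.2, size=seconds), 0)

for name, c in (("poisson", poisson_counts), ("bursty", bursty_counts)):
    print(f"{name}: mean={c.mean():.1f} req/s, variance={c.var():.1f}")
# The means agree; the variance (and the capacity you need) does not.
```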
HPsquared 2 hours ago [-]
Becquerel (or counts per second) has the same problem, in that it doesn't measure the "energy" of each request.
I do like the analogy though. Actual radiation has many forms and energy levels.
Decay chains are a nice analogy you could use too (i.e. a branching out of subsequent processes and work that come later, but are a consequence of the initial request).
HPsquared 2 hours ago [-]
And yes, like Sieverts, some types of incoming request, and some "organs" are more consequential than others. There's even an analogy to "committed dose" as the database accumulates things.
maxnoe 2 hours ago [-]
Copying my comment from the other recent thread:
The authority on the definition of SI units is very clear:
> The hertz shall only be used for periodic phenomena and the becquerel shall only be used for stochastic processes in activity referred to a radionuclide
Usually, no radionuclides are involved in web requests.
https://www.bipm.org/documents/d/guest/si-brochure-9-en-pdf
We don't use units of measurement.
We use metrics, because we have a lot more context.
Rps (requests per second) is a commonly used unit, but it has no defined standard: you could, and often do, average it over time for reporting, but no one says you have to. For scaling, however, you'll probably want to use the max rather than the average, because no one wants a web application where, in business as usual, 60% of the time it works every time.
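A tiny illustration of average versus max (the per-second counts are invented, not measured data):

```python
# Why scaling on average rps undersizes bursty traffic. The per-second
# counts below are invented illustration data, not measurements.
counts = [5, 6, 4, 120, 5, 7, 130, 6, 5, 4]   # requests observed each second

avg = sum(counts) / len(counts)
peak = max(counts)
p95 = sorted(counts)[int(0.95 * len(counts)) - 1]  # crude percentile

print(f"avg={avg:.1f} rps, p95={p95} rps, peak={peak} rps")
# Provision for the average (~29 rps) and both ~120 rps bursts overflow;
# provision for the peak or a high percentile and they don't.
```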
cassianoleal 1 hour ago [-]
I've always used requests / second (or minute, etc) and that never seemed to be controversial. Why is there the need to find a different unit?
felooboolooomba 1 hour ago [-]
There is no need for a different unit, it's just people trying to be clever and people copying them.
rglullis 2 hours ago [-]
Anyone else got served a page with garbage content meant for AI scrapers?
stingraycharles 1 hour ago [-]
I got it as well, using a VPN.
perching_aix 2 hours ago [-]
Oh, that's kinda fun. I got the same thing I get for every Mastodon (and Anubis-protected) link: a page telling me it won't work without JavaScript. I guess, since AI scrapers these days do run some amount of JS, that serves as a second layer of defense?
At least for Twitter there are proxies that work without JS. For Mastodon, none that I'm aware of. I usually just audibly sigh and remark that they shall "keep their secrets then", and move on.
rglullis 1 hour ago [-]
We are not talking about the same thing, it seems. I can understand a web page that doesn't work without javascript.
What I do not understand is someone who goes through all this work of putting an AI-scraper tarpit on Mastodon, a system that fundamentally needs to have its data distributed to other servers. It's just signalling and posturing, because that content is available on any server that has someone following the account.
(Tip to AI scrapers: if you want to slurp all the data from the fediverse, just create an account on mastodon.social and pull the data from the "Federated timeline" stream.)
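The commenter's tip maps to the Mastodon REST API's public-timeline endpoint (GET /api/v1/timelines/public, where local=false includes federated posts). A hedged sketch — the instance name and page size are the commenter's suggestion plus assumptions; check an instance's policies and rate limits before pulling data:

```python
# Sketch of pulling the federated timeline via the Mastodon REST API.
# Endpoint: GET /api/v1/timelines/public (local=false -> includes remote posts).
# Instance name and limit are illustrative; Mastodon caps limit at 40.
import json
import urllib.request

def timeline_url(instance: str, limit: int = 40, local: bool = False) -> str:
    """Build the public-timeline URL for a given instance."""
    return (f"https://{instance}/api/v1/timelines/public"
            f"?limit={limit}&local={str(local).lower()}")

def fetch_page(instance: str = "mastodon.social") -> list:
    """Fetch one page of statuses (a JSON list of status objects)."""
    with urllib.request.urlopen(timeline_url(instance)) as resp:
        return json.load(resp)

# Example (requires network; uncomment to run):
# for status in fetch_page()[:3]:
#     print(status["account"]["acct"], status["url"])
```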
userbinator 55 minutes ago [-]
To those who automatically assume humans with "weird" setups are "AI scrapers" (also a bit of a boogeyman these days): FUCK YOU. I'm a human, not a stupid mindless sheeple.
rglullis 38 minutes ago [-]
Yeah, that's the part of the Fediverse that I really don't like...
All the talk about "putting the human first" and "embracing diversity" goes out of the window the moment you are not diverse in the way they want.
dalmo3 39 minutes ago [-]
I assumed the author was one of those HN haters and was filtering by referrer. Now I'm curious.
rglullis 36 minutes ago [-]
No, I tried going to the page on a fresh incognito session. Same thing.
a3w 2 hours ago [-]
PSA: units never belong in square brackets.
[R] = Ohm
Never [Ohms]
amingilani 3 hours ago [-]
Eli5?
PunchyHamster 3 hours ago [-]
It's just a bad joke; nobody uses either for this.
raffael_de 2 hours ago [-]
I'd say Hz is quite a regular choice for _this_ ... it's just not usually referred to as "Hertz" by IT practitioners. Technically, Bq and Hz are the same unit, 1/s; the difference is that Bq is used for random physical events (comparable to web requests) and Hz for periodic physical events.
felooboolooomba 1 hour ago [-]
I use Hz to measure my adult fun time. With the metric sub-unit prefixes, I can make it sound very high to the untrained eye.
a3w 2 hours ago [-]
He compares server requests to radiation. It can be harmful or harmless, and it just happens in the background.