Without arguing the merits of the Altera investment or divestment, a common pattern for Intel seems to be a wild see-sawing between an aggressive and a defensive market posture - it’s a regular occurrence for Intel to announce a bold new venture to try to claim some new territory, and just as regular that they announce they’re halting that venture in the name of “consolidating” and “focusing on their core.” The consequence is that they never give new ventures time to actually succeed, so they just bleed money creating things they murder in the cradle, and nobody born before last Tuesday bothers investing time in learning the new Intel thing because its expected lifespan is shorter than the average Google product.
Intel either needs to focus or they need to be bold (and I’d actually prefer they be bold - they’ve started down some cool paths over time), but what they really need is to make up their goddamn minds and stop panicking every other quarter that their “ten-year bets” from last quarter haven’t paid off yet.
thunder-blue-3 15 hours ago [-]
Speaking from personal experience, many director-level and above positions at Intel, especially in growth-related areas, are filled through nepotism and professional connections. I've never seen a headline about Intel’s decline and thought, 'Wow, how could that happen?'
PaulHoule 14 hours ago [-]
I had a business partner I agreed with on a lot of things, but not about Intel. My assumption was that any small software package from Intel, such as a graph processing toolkit, was trash. He thought they could do no wrong.
Intel really is good at certain kinds of software, like compilers or MKL, but my belief is that organizations like that have a belief in their "number oneness" that gets in the way of doing anything that is outside what they're good at. Maybe it is the people, processes, organization, values, etc. that get in the way. Or maybe it's not having the flexibility to know that what is good at task A is not good at task B.
f1shy 9 hours ago [-]
I always saw Intel as a HW company making terribly bad SW. Anywhere I saw Intel SW I would run away. Lately I used a big open source library from them, which is standard in the embedded space. It works great, but if you look at the code you will be puking for a week.
john01dav 7 hours ago [-]
In my experience Intel's WiFi and Bluetooth drivers on Linux are, by far, the best. They're reliably available on the latest kernel and they actually work. After having used other brands on Linux, I have no intention of getting non-intel WiFi or Bluetooth any time soon. The one time that I found a bug, emailing them about it got me in direct contact with the developers of the driver.
I had a different, non-Intel WiFi card before where the driver literally permanently fried all occupied PCIe slots -- they never worked again, and the problem happened right after installing the driver. I don't know how a driver could cause that, but it looks like it did.
travisgriggs 32 minutes ago [-]
Interesting. Does Bluez fall under that umbrella?
I have found bluez by far the hardest stack to use for Bluetooth Low Energy Peripherals. I have used iOS’s stack, suffered the evolution of the Android stack, used the ACI (ST’s layer), and finally done just straight python to the HCI on pi. Bluez is hands down my least favorite.
bayindirh 6 hours ago [-]
Yes, their open source drivers had a painful birth, but they are good once they're sanded and sharpened with the community.
However, they somehow managed to bork the e1000e driver in a way that certain older cards sometimes fail to initialize and require a reboot. I was bitten by the bug, and the problem was later fixed by reverting the problematic patch in Debian.
I don't know the current state of the driver since I passed the system on. Besides a couple of bad patches in their VGA drivers, their cards are reliable and work well.
From my experience, their open source driver quality does not depend on the process, but on specific people and their knowledge and love for what they do.
I don't like the aggressive Intel which undercuts everyone with shady tactics, but I don't want them to wither and die, either. It seems like their process, frequency and performance "tricks" are biting them now.
1oooqooq 3 hours ago [-]
that's only because their hardware is extremely simple.
so the driver has little to screw up. but they still manage to! for example, the PCI cards are all broken, when it's literally the same hardware as the USB ones.
_zamorano_ 5 hours ago [-]
The team working on their Realsense depth cameras was doing great work on the SDK, in my opinion.
Frequent releases, GitHub repo with good enough user interaction, examples, bug fixing and feedback.
throwaway2037 10 hours ago [-]
> such as a graph processing toolkit
This is oddly specific. Can you share the exact Intel software toolkit?
> "number oneness"
Why does this not affect NVidia, Amazon, Apple, or TSMC?
bayindirh 6 hours ago [-]
A friend who developed a game engine from scratch and is familiar with the inner workings and behavior of the NVIDIA driver calls it an absolute circus of a driver.
Also, their latest consumer card launches are less than stellar, and the tricks they use to pump up performance numbers are borderline fraud.
As Gamers Nexus puts it "Fake prices for fake frames".
varelse 2 hours ago [-]
[dead]
ip26 10 hours ago [-]
The affliction he’s imputing is born of absolute dominance over decades. Apple has never had the same level of dominance, and NVidia has only had it for two or three years.
It could possibly come to haunt NVidia or TSMC in decades to come.
tharkun__ 14 hours ago [-]
See the funny thing is, even with all of this stuff about Intel that I hear about (and agree with as reported), I also just committed a cardinal sin just recently.
I'm old, i.e. "never buy ATI" is something that I've stuck to since the very early Nvidia days. I.e., I switched from Matrox and Voodoo to Nvidia while commiserating and witnessing friends' and colleagues' ATI woes for years.
The high end gaming days are long gone; there was even a stretch of laptops where 3D graphics was of no concern whatsoever. I happened to have Intel chips and integrated graphics. I could even start up some games I'd missed out on during those years, or replay old favourites, just fine, as even a business laptop's Intel integrated graphics chip was fine for it.
And then I bought an AMD based laptop with integrated Radeon graphics because of all that negative stuff you hear about Intel and AMD itself is fine, sometimes even better, so I thought it was fair to give it a try.
Oh my was that a mistake. AMD Radeon graphics is still the old ATI in full blown problem glory. I guess it's going to be another 25 years until I might make that mistake again.
accrual 14 hours ago [-]
It's a bummer you've had poor experiences with ATI and later AMD, especially on a new system. I have an AMD laptop with Ryzen 7 7840U which includes a Radeon 780M for integrated graphics and it's been rock solid. I tested many old and new titles on it, albeit at medium-ish settings.
What kind of problems did you see on your laptop?
PaulHoule 11 hours ago [-]
Built a PC with a top-of-the-line AMD CPU, it's great. AMD APUs are great in dedicated gaming devices like the XBOX ONE, PS4 and 5 and Steam Deck.
On the other hand, I still think of Intel integrated GPUs as "that thing that screws up your web browser chrome if you have a laptop with dedicated graphics".
aleph_minus_one 14 hours ago [-]
Not tharkun__:
AMD basically stopped supporting (including updating drivers for) GPUs before RDNA (in particular GCN), while such GPUs were still part of AMD's Zen 3 APU offerings.
tharkun__ 13 hours ago [-]
Well back when, literally 25 years ago, when it was all ATI, there were constant driver issues with ATI. I think it's a pretty well known thing. At least it was back then.
I did think that given ATI was bought out by AMD and AMD itself is fine it should be OK. AMD always was. I've had systems with AMD CPUs and Nvidia GPUs back when it was an actual desktop tower gaming system I was building/upgrading myself. Heck my basement server is still an AMD CPU system with zero issues whatsoever. Of course it's got zero graphics duties.
On the laptop side, for a time I'd buy something with discrete Nvidia cards when I was still gaming more actively. But then life happened, so graphics was no longer important and I do keep my systems for a long time / buy non-latest gen. So by chance I've been with Intel for a long time and gaming came up again, casually. The Intel HD graphics were of course totally inadequate for any "real" current gaming. But I found that replaying some old favs and even "newer" games I had missed out on (new as in, playing a 2013 game for the very first time in 2023 type thing) was totally fine on an Intel iGPU.
So when I was getting to newer titles, the Intel HD graphics no longer cut it but I'm still not a "gamer" again, I looked at a more recent system and thought I'd be totally fine trying an AMD system. Exactly like another poster said, "post 2015 should be fine, right?! And then there's all this recent bad news about Intel, this is the time to switch!".
Still iGPU. I'm not going to shell out thousands of dollars here.
And then I get the system and I get into Windows and ... everything just looks way too bright, washed out, hard to look at. I doctored around, installed the latest AMD Adrenalin driver, played around with brightness, contrast, HDR, color balance, tried to disable the Vari-Brightness I read was supposed to be the culprit, etc. It does get worse once you get into a game. Like you're in Windows and it's bearable. Then you start a game and you might Alt-Tab back to do something and everything is just awfully weirdly bright, and it doesn't go away when you shut down the game either.
I stuck with it and kept doctoring for over 6 months now.
I've had enough. I bought a new laptop, two generations behind, with an Intel Iris Xe, for the same amount of money as the ATI system. I open Windows and ... everything is entirely totally 150% fine, no need to adjust anything. It's comfortable, colors are fine, brightness and contrast are fine. And the performance is entirely adequate, about the same as with the AMD system. Again, still iGPU and that's fine and expected. It's the quality I'm concerned with, not the performance I'm paying for. I expect to be able to get proper quality software and hardware even if I pay for less performance than gamer kid me back then was willing to.
wtallis 13 hours ago [-]
> And then I get the system and I get into Windows and ... everything just looks way too bright, washed out, hard to look at.
I've seen OEMs do that to an Intel+NVIDIA laptop, too. Whatever you imagine AMD's software incompetence to be, PC OEMs are worse.
tharkun__ 12 hours ago [-]
It's Lenovo. FWIW, one thing I really didn't like much either was that I found out that AMD really tries to hide what actual GPU is in there.
Everything just reports it as "with Radeon graphics", including benchmarking software, so it's almost impossible to find anything about it online.
The only thing I found helped was GPU-Z. Maybe it's just one of the known bad ones and everything else is fine and "I bought the one lemon from a prime steak company" but that doesn't change that my first experience with the lemon company turned prime steak company is ... another lemon ;)
It's a Lucienne C2 apparently. And again, performance wise, absolute exactly as I expected. Graphics quality and AMD software? Unfortunately exactly what I expected from ATI :(
And I'm not alone when I look online and what you find online is not just all Lenovo. So I do doubt it's that. All and I mean all my laptops I'm talking about here were Lenovos. Including when they were called IBM ThinkPads and just built by Lenovo ;)
dharmab 9 hours ago [-]
Laptops have really gone to hell in the past few years. IMO the only sane laptop choices remaining are Framework and Apple. Every other vendor is a mess, especially when it comes to properly sleeping when closing the lid.
badc0ffee 11 hours ago [-]
I bought an AMD Ryzen Thinkpad late last year, and I had the same issue with bright/saturated colours. I fixed it by running X-Rite Color Assistant which was bundled with the laptop, and setting the profile to sRGB. I then turned up the brightness a little.
I think this is a consequence of the laptop having HDR colour, and the vendor wanting to make it obvious. It's the blinding blue LED of the current day.
tharkun__ 10 hours ago [-]
Yeah, I read HDR might be the issue. I didn't know X-Rite and it did not come with the laptop, but I did play with disabling / trying to adjust HDR, making sure sRGB was set, etc. Did not help. Also ran all the calibrations I could find for gamma, brightness and contrast many many times to try and find something that was better.
What I settled on for quite some time was manually adjusted color balance and contrast and turning the brightness down. That made it bearable but especially right next to another system, it's just "off" and still washed out.
If this was HDR and one can't get rid of it, then yeah agreed, it's just bad. I'm actually surprised you'd turn the brightness up. That was one of the worst things to do, to have the brightness too high. Felt like it was burning my eyes.
badc0ffee 10 hours ago [-]
I think I found X-Rite by just searching for color with the start menu.
Before I used that tool, I tried a few of the built-in colour profiles under the display settings, and that didn't help.
I had to turn the brightness up because when the display is in sRGB it gets dimmer. Everything is much more dim and muted, like a conventional laptop screen. But if I change it back to say, one of the DICOM profiles, then yeah, torch mode. (And if I turn the brightness down in that mode, bright colours are fine but dim colours are too dim and everything is still too saturated).
ecbc 12 hours ago [-]
[dead]
dismalaf 14 hours ago [-]
Did you time travel from 2015 or something? Haven't heard of anyone having AMD issues in a very long time...
happyPersonR 13 hours ago [-]
I’ve been consistently impressed with AMD for a while now. They’re constantly undervalued for no reason other than CUDA from what I can tell.
dismalaf 13 hours ago [-]
AMD is appropriately valued IMO, Intel is undervalued and Nvidia is wildly overvalued. We're hitting a wall with LLMs, Nvidia was at one point valued higher than Apple which is insane.
Also CUDA doesn't matter that much, Nvidia was powered by intense AGI FOMO but I think that frenzy is more or less done.
I've used Linux exclusively for 15 years so probably why my experience is so positive. Both Intel and AMD are pretty much flawless on Linux, drivers for both are in the kernel nowadays, AMD just wins slightly with their iGPUs.
pjmlp 8 hours ago [-]
Yet my AMD APU was never properly supported for hardware video decoding, and could only do up to OpenGL 3.3, while the Windows 10 driver could go up to OpenGL 4.1.
cycomanic 10 hours ago [-]
I wish I had an AMD card. Instead our work laptops are X1 Extremes with discrete Nvidia cards and they are absolutely infuriating. The external outputs are all routed through the Nvidia card, so one frequently ends up with the fan blowing on full blast when plugged into a monitor. Moreover, when unplugging, the laptop often fails to shut down the discrete graphics card, so suddenly the battery is empty (because the discrete card uses twice the power). The Intel card, on the other hand, seems to prevent S3 sleep when on battery, i.e. the laptop starts sleeping and immediately wakes up again (I chased it down to the Intel driver but couldn't get further).
And I'm not even talking about the hassle of the nvidia drivers on Linux (which admittedly has become quite a bit better).
All that just for some negligible graphics power that I'm never using on the laptop.
imtringued 14 hours ago [-]
Meanwhile PC gamers have no trouble using their AMD GPUs to play Windows games on Linux.
tharkun__ 12 hours ago [-]
That's actually something I have not tried at all again yet.
Back in the day, w/ AMD CPU and Nvidia GPU, I was gaming on Linux a lot. ATI was basically unusable on Linux while Nvidia (not with the nouveau driver of course), if you looked past the whole kernel driver controversy with GPL hardliners, was excellent quality and performance. It just worked and it performed.
I was playing World of Warcraft back in the mid 2000s via Wine on Linux and the experience was actually better than in Windows. And other titles like say Counter Strike 1.5, 1.6 and Q3 of course.
I have not tried that in a long time. I did hear exactly what you're saying here. Then again I heard the same about AMD buying ATI and things being OK now. My other reply(ies) elaborate on what exactly the experience has been if you're interested.
baq 9 hours ago [-]
Can’t say what your experience with your particular box will be, but the steam deck is absolutely fantastic.
vishnugupta 6 hours ago [-]
That’s not specific to Intel though. That’s how Directors and above are recruited in any big company.
For example, Uber hired a VP from Amazon. And the first thing he did was hire most of his immediate reports from Amazon into Director/Senior Director positions at Uber.
At that level of management work gets done mostly through connections, favors and networking.
ethbr1 13 hours ago [-]
> it’s a regular occurrence for Intel to announce a bold new venture to try to claim some new territory, and just as regular that they announce they’re halting that venture in the name of “consolidating” and “focusing on their core.” [...] [Intel's new thing's] expected lifespan is shorter than the average Google product.
You got there in the end. You get the same outcome with the same corporate incentive.
Both Intel and Google prioritize {starting something new} over {growing an existing thing}, in terms of corporate promotions and rewards, and therefore employees and leaders self-optimize to produce the repeated behavior you see.
The way to fix this would be to decrease the rewards for starting a new thing and increase the rewards for evolving and growing an existing line of business.
throwaway2037 10 hours ago [-]
> Both Intel and Google prioritize {starting something new} over {growing an existing thing}, in terms of corporate promotions and rewards, and therefore employees and leaders self-optimize to produce the repeated behavior you see.
I cannot speak for Intel, but Google has done very well by "growing an existing thing" in AdWords and YouTube. Both account for the lion's share of profits. They are absolute revenue giants. Many have tried, and failed to chip away at that lead, but Google has managed to adapt over and over again.
4gotunameagain 8 hours ago [-]
Those are the only two things that Google has regularly maintained, one of which has one of the biggest moats (YouTube, the go-to video service), and the other is connected to the homepage of the internet.
It's really hard to fuck these things up. Which they have been trying hard, given the state of youtube and the search engine.
blitzar 7 hours ago [-]
rewards for not fucking up an existing (monopoly) line of business
I can see why you have to be "special" to work at these places.
dylan604 11 hours ago [-]
It's similar to sales vs dev in software. Sales are always prioritizing new features to attract new users instead of fixing the known issues that are pissing off your current users.
A new feature attracts new users and allows for fancy press releases. Nobody cares about press releases about an existing product getting a bug fix or becoming more stable.
Our society is nothing but an "ooh look, shiny!" type of short attention span.
smallmancontrov 15 hours ago [-]
M&A churn is a way for management to monetize their power. Efficacy is a distant second concern.
kccqzy 15 hours ago [-]
How does management benefit from M&As? Sorry if this is a basic question. Do executives get paid based on the number of acquisitions?
stackskipton 14 hours ago [-]
Two ways:
Bonuses by juicing revenue numbers
Bigger next job by doing M&A and having really good-looking resume and interview story.
> Additionally, managers may prefer mergers because empirical evidence suggests that the size of a company and the compensation of managers are correlated.
Yeah, that's where my mind went. Executive and upper management salaries seem to be a function of revenue, not profit.
atq2119 12 hours ago [-]
A lot of compensation works that way, to be honest. First order is that you get a percentage of whatever river of money you sit close to, regardless of effort or skill.
stackskipton 12 hours ago [-]
Esp when you are talking about software. Revenue means you have a customer that is locked up. Once you are ready to get profit, reduce costs/jack up prices and profit comes rolling in.
nine_k 14 hours ago [-]
If I hold stock in a company, then my company acquires that company, the stock rises, and I liquidate my position in it after 6 months or whatever the cool-down period is, is this considered insider trading?
JoshTriplett 14 hours ago [-]
If you hold stock in company A, and your current company B acquires company A, that's not insider trading if you already owned the stock in company A before you had any information that company B was going to make that decision.
It is, however, a conflict of interest for you to be involved in company B's acquisition of company A (e.g. influencing company B to buy company A), and might even rise to the level of a breach of your fiduciary duty to company B.
3eb7988a1663 13 hours ago [-]
I know a woman who was part of a M&A team. On her first day, she was told her days of owning individual stocks in the industry were over. She could only purchase aggregate funds. Although, I do wonder if the same rules apply to the VPs who actually have to sign off on the deals.
wrs 14 hours ago [-]
Insider trading is all about information held by "insiders", not about who owns what. So it would depend on whether you know something material and nonpublic when you liquidate your position (e.g., you know the acquisition is going terribly and the acquiring company is going to write it off).
nine_k 15 hours ago [-]
But, well, it was a ten-year bet: Altera was acquired in 2015.
If they could not figure how to make it profitable, maybe somebody else should try. (Of course I don't think that the PE company is going to do just that.)
wtallis 14 hours ago [-]
It was a ten-year bet, but they spent the first several years actively sabotaging Altera by trying to move their whole product stack over to non-functional Intel fabs.
iaresee 14 hours ago [-]
...and the majority of their internal development systems they used for all their chip design and layout.
dylan604 11 hours ago [-]
Doesn't purchase by a PE company pretty much guarantee the death of it? At least the selling off of the most profitable parts and pieces? Has there ever been a story of a PE purchase and the company grew under the new owner?
robert2020 11 hours ago [-]
PE’s buy companies to increase the company’s value then sell it. There’s been many successes. Powerschool, Hilton, Dunkin’ Brands, Dollar General, Beats by Dre, Petco, GoDaddy, BJ’s Wholesale Club, Neiman Marcus, Panera Bread, Allegro, Guitar Center, Nielsen, McAfee…
no_wizard 10 hours ago [-]
Most of these have very serious issues, especially with regards to labor violations and general treatment of employees.
gonzo 9 hours ago [-]
Silver Lake took Dell private.
jrockway 15 hours ago [-]
This seems to be common for corporate America in general. I used to work at a YC startup. We kiiiiiinda maaaaaaaybe ran out of money (not my department) and happened to get bought by a large investor that also happens to be a US-based hardware manufacturer. Two years and countless reorgs later, they laid everyone off and, as far as I know, are no longer in the business of selling the software products they bought. They never figured out how software worked, never had anyone managing the division for more than 6 months, and got bored. I think they thought that by moving everyone over to Microsoft Word and Windows laptops (peppered with a half-hearted threat about RTO), they would just magically make billions of dollars the first month. It didn't happen.
I am beginning to think M&A are just some sort of ego thing for bored megacorp execs, rather than serious attempts to add efficiency and value to the marketplace. (Prove me wrong, bored megacorp execs. I'll wait.)
aurumque 15 hours ago [-]
Having been through a few acquisitions myself, I think there is a perverse incentive where buying and destroying any competition (real or imagined) leads to positive enough outcomes that it doesn't matter if the underlying asset is destroyed. Nobody would come out and say that, but when an acquisition is tossed aside there may not be enough repercussions to prevent it from happening again.
tinco 13 hours ago [-]
Intel bought a drone company that was producing the only drone that was good enough for my real estate inspection company to use. They acquired it and then killed it a year or two after. The inspection industry didn't have a proper drone for years after that until DJI started getting serious about it and produced the M30E.
It was just senseless. Intel doesn't have real or imagined competition from a drone company; it wasn't even close to being in the same market. They just believed the hype about drones being the next big thing, and when they found out they were too early, they decided they didn't have the patience to wait for drones to become a thing and killed it. There was no long term vision behind it.
zvr 9 hours ago [-]
The high-end Falcon models were an engineering marvel and, as you say, nothing else in the market was even close.
I don't know about "real estate inspection", but another use case was for them to be used in oil rigs in the North Sea to inspect the structure of the rig itself. They had to be self-stabilizing under high winds and adverse weather conditions, and they had to carry a good enough camera to take detailed photos.
Unfortunately, while the technology was there, the market wasn't. Not many wanted to get a $35K drone to be able to sustain this business.
throwaway2037 10 hours ago [-]
Wow, this post is really specific. What special hardware is required on a drone for "real estate inspection"?
tinco 2 hours ago [-]
The company is still thriving, you can check out their website to find out more about what real estate inspection is about (in The Netherlands): https://www.aeroscan.nl/
andyferris 10 hours ago [-]
I believe it could be the weight of the camera and lens you would desire for good looking photos (think Sony a7 size). Good looking photos sell houses.
EDIT I just noticed the “inspection” part. Maybe they wanted good zoom to spy on the tenants? (Or maybe that’s a really uncharitable take).
random_savv 8 hours ago [-]
To me, high quality photos for real estate inspection means (e.g.) being able to take high resolution photos of a specific part of the roof so you can understand why there’s a leak. Not having to climb is a big deal!
timewizard 15 hours ago [-]
This is one of the main reasons we added anti monopoly provisions to our laws more than 100 years ago. Market dominance is a recognized factor in allowing this inversion of rewards to occur.
That's the face of it. Labor is a market as well. The impacts of these arrangements on our labor pool are extraordinary. It's a massive displaced cost of allowing these types of mergers to occur, borne by the people who stand to gain the least from the merging of business assets.
ethbr1 13 hours ago [-]
> I am beginning to think M&A are just some sort of ego thing for bored megacorp execs
It seems like a low risk effort to put a promising inexperienced exec in charge of a recent acquisition.
If they're a screw up and run it into the ground, imagine how much damage they could have done in a megacorp position of power.
Megacorp saved (at the cost of a smaller company)
WeylandYutani 13 hours ago [-]
Is Intel still a mega corporation? That seems to be the real problem for Intel. Becoming prey.
apercu 26 minutes ago [-]
It could just be a stock play. Need the stock to move up? Buy a company.
Stock down again? Sell the company you bought 2 years ago.
From the top to the bottom the problem with late stage capitalism is misaligned incentives.
Edit: I wrote "the problem" and I should have written "among the many, many problems"
wmf 15 hours ago [-]
And Intel's acquisitions kill off promising startups. At least Altera is being sort of spun off instead of outright destroyed.
fredoralive 7 hours ago [-]
My personal theory is that desktop / laptop / server x86 (usually) is such a giant money printer that a) Intel can invest in anything (Altera, antivirus, Optane...) but b) when they do, they quickly realise that this isn't a giant profit margin machine like x86, so why bother?
bigfatkitten 6 hours ago [-]
They fuck their customers when they do that. A good friend of mine had a product designed around Quark that was about to go into production when Intel pulled the rug out from under him.
wombatpm 12 hours ago [-]
I worked for a former Fortune 300 company that had an active internal investment strategy. They wanted the next billion dollar business, guaranteed, in 12 months. And never wanted to invest more than 1 million dollars. Sadly they are now bankrupt and owned by PE.
rqtwteye 14 hours ago [-]
Seems they should read Andy Grove’s books.
evertedsphere 11 hours ago [-]
> a wild see-sawing between an aggressive and a defensive market posture
tick, tock
thot_experiment 16 hours ago [-]
Rest in Peace Altera, I guess? I still drink out of my color changing Altera mug (that's long stopped changing color) most days. PE ruins everything, so it's only a matter of time before they're gutted and sold for scraps by the vultures at Silver Lake. (Though honestly the writing has been on the wall since the Intel acquisition, I had held onto some hope.) If only we had a functioning government interested in actually maintaining our technological dominance and enforcing/expanding antitrust legislation. I wrote my first Verilog on an Altera chip and I'll remember them fondly.
neilv 14 hours ago [-]
> [...] my color changing Altera mug (that's long stopped changing color) most days. PE ruins everything [...]
I don't think PE is responsible for that one.
thot_experiment 13 hours ago [-]
I think your attention heads might not be reading the positional encoding quite right. :P
greenavocado 16 hours ago [-]
[flagged]
jcranmer 12 hours ago [-]
Tariffs done well could be a boon for a sector. But they are tricky to do well, and the current administration doesn't show the slightest hint of being capable of doing it well.
You have to treat tariffs not as a moat to protect an industry for good, but a runway to give a nascent industry enough room to take off. In a mature industry, tariffs are more likely to keep incumbents uncompetitive and disincentivize investments that would make them more competitive, especially if those are capital-heavy.
Also, tariffs aren't going to be effective if other structural issues exist in an industry that prevent or sharply limit expansion. Like key components having a sole worldwide supplier with a full order book. Or if the capital investment to set up a new factory is beyond the ability of the financial markets to provide.
kristjansson 14 hours ago [-]
Yes a competent and forward-looking trade and industrial policy would be nice, as a treat.
9283409232 15 hours ago [-]
I don't think people are opposed to tariffs, at least they weren't before. Bernie Sanders has historically been for tariffs when used properly. Used properly being the important phrase. When you have someone who doesn't understand what a trade deficit is imposing tariffs based on the difference between deficit and surplus, you pretty effectively turn people against tariffs on top of the whole destroying the global economy thing.
dingnuts 15 hours ago [-]
No, the issue is not that Trump doesn't have the appropriate understanding. The issue is that they are illegal tariffs.
The President does not have the power to create tariffs. Congress does!
The reason the President doesn't have this power is because the economy should not rest on the whims or understanding of any one person.
cwillu 13 hours ago [-]
Don't forget the part where the tariffs were imposed in violation of trade agreements and treaties the US had already negotiated and agreed to.
kimbernator 2 hours ago [-]
Congress delegated tariff power to the president after Smoot-Hawley caused such a disaster.
> The reason the President doesn't have this power is because the economy should not rest on the whims or understanding of any one person.
It's interesting to see that it just takes time for lessons to be un-learned. The reason Smoot-Hawley was such a disaster is that it took hundreds of people to agree that it was good policy in the house, which meant adding tariffs to the bill in favor of the districts they individually represented. The result was an egregiously long list of things being tariffed. They delegated it to the one person specifically because they weren't similarly beholden to so many conflicting pressures.
I don't mean any of this to defend Trump's actions; in fact the opposite: he's essentially managing to do the same thing even without political pressures to do so. I just mean to say that it was reasonably sane for Congress to have delegated tariffs in a limited capacity when this flaw was revealed.
ethbr1 13 hours ago [-]
In this case it's not Trump but Peter Navarro [0] who doesn't understand how tariffs work, because he's apparently never looked into multiparty game theory.
Exhibit A: Navarro being sidelined and Scott Bessent put in charge of running tariff negotiations, after the bond markets spooked.
No it's Trump. If it wasn't made clear to you during his first term that he doesn't understand tariffs or trade deficits, he spent most of 2023 and 2024 campaigning and showing you he doesn't understand it.
ethbr1 3 hours ago [-]
Trump doesn't understand most things he does, hence why his advisors have so much power over outcomes, and policies shift as different advisors fall in and out of favor.
9283409232 14 hours ago [-]
I mean there are several issues at play here. He is being sued for the illegal tariffs since Republicans are spineless and are cool with him just doing anything so I'm focusing on the practical problems.
knowaveragejoe 15 hours ago [-]
This comment is textbook Poe's Law.
hiddencost 15 hours ago [-]
[flagged]
greenavocado 15 hours ago [-]
There is no need to resort to personal attacks
d-moon 15 hours ago [-]
As someone who's worked at Xilinx before and after the merger, it's a surprise they were even able to sell it for that much. Altera has been noncompetitive to Xilinx in performance and to Lattice in terms of low-end/low-power offerings for at least the last 2 generations.
I'm concerned about the future of FPGAs and wonder who will lead the way to fix these abhorrent toolchains these FPGA companies force upon developers.
HelloNurse 6 hours ago [-]
So Intel found optimists who think they can make Altera more competitive? It's a success. Success with Intel products would be better, and excellence at M&A is hard to convert into excellence at chipmaking, but it's better than nothing.
gscott 15 hours ago [-]
It seems FPGAs can now do things for LLMs, so there might be some future in that
I hear this a lot, but in my experience this isn't true at all.
A Versal AI Edge FPGA has a theoretical performance of 0.7TFLOPs just from the DSPs alone, while consuming less power than a Raspberry Pi 5 and this is ignoring the AI Engines, which are exactly the ASICs that you are talking about. They are more power efficient than GPUs, because they don't need to pretend to run multiple threads each with their own register files or hide memory latency by swapping warps. Their 2D NOC plus cascaded connections allow them to have a really high internal memory bandwidth in-between the tiles at low power.
What they are missing is processing in memory, specifically LPDDR-PIM for GEMV acceleration. The memory controllers simply can't deliver a memory bandwidth that is competitive with what Nvidia has and I'm talking about boards like Jetson Orin here.
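(For a rough sense of where a headline figure like 0.7 TFLOP/s from DSP blocks can come from, here's a back-of-envelope sketch in Python; the DSP count and clock below are hypothetical placeholders chosen only for illustration, not vendor specs.)

    # Back-of-envelope peak-throughput estimate for an FPGA's DSP blocks.
    # All numbers are assumptions for illustration, not Versal datasheet values:
    # peak = DSP_count * ops_per_DSP_per_cycle * clock
    dsp_count = 1300       # hypothetical number of DSP blocks
    ops_per_cycle = 2      # one fused multiply-add counts as 2 floating-point ops
    clock_ghz = 0.27       # hypothetical achievable fabric clock, in GHz

    peak_tflops = dsp_count * ops_per_cycle * clock_ghz / 1000
    print(f"~{peak_tflops:.2f} TFLOP/s theoretical peak")  # ~0.70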
wmf 9 hours ago [-]
Now compare that FPGA to Groq, SambaNova, or Cerebras. The ASICs are more efficient.
threatripper 8 hours ago [-]
FPGAs are neither here nor there and will always be niche. If you need the same thing many times you make dedicated silicon. If you need many different things available all at once you use a normal CPU. Only if ASICs are too expensive and CPUs are too slow the FPGA can shine.
teleforce 14 hours ago [-]
If LLMs can leverage the new, more efficient attention mechanism based on the FFT architecture discovered by Google, then FPGAs can be the new hot stuff [1]:
[1] The FFT Strikes Back: An Efficient Alternative to Self-Attention (168 comments):
You worked at Xilinx and you're not aware that FPGA is not a growing segment?
throwaway2037 10 hours ago [-]
> FPGA is not a growing segment
What is replacing it? Single board computers? Or are APUs from ARM "good enough" and "cheap enough" now to replace FPGA?
almostgotcaught 10 hours ago [-]
There is literally no market for FPGA as coprocessor/accelerator and there never was (that was some kind of pipe/hype dream before GPGPU took off). Where there is a market for them (prototyping ASICs, automotive, whatever, network switches, etc) there is no replacement but there is also no growth.
Thimothy 6 hours ago [-]
Depends entirely on how you define "growth". If you take AI and LLMs as your baseline of growth, then yeah, sure. But what else is growing?
FPGAs are getting cheaper with each gen, expanding into low cost, high volume markets that were unthinkable for an FPGA 10 years ago. Lattice has an FPGA family specifically targeted to smartphones, and I've been consulting for a high end audio company that wanted to do some dsp, and a cheap FPGA was the best option in the market for the particular implementation that they wanted to do.
It's not sexy growth, but it's growth. Otherwise, we wouldn't had the explosion of the latest years in low end FPGA companies.
Calwestjobs 3 hours ago [-]
you can do high end audio DSP on a dsPIC lol.
look up SigmaStudio DSPs; DSP is insanely cheap to do, there is absolutely no need for an FPGA. what that guy was doing was either nonsense or it was in 1995, which are both irrelevant points, or rather you provided examples that show FPGAs are an irrelevant, no-growth market.
(how many audio devices were using TMS320 DSPs even before and after the iPod was a thing...)
Thimothy 2 hours ago [-]
My point is that FPGAs have become very cheap, competing with microcontrollers. I would agree that high end audio manufacturers are about as rational as their customers.
If FPGAs are not a growing market, how come we have gone from 2 companies (we'll ignore niche space stuff) to ~10 in the last 20 years? Not many IC fields where there is a growth in manufacturers instead of consolidation...
imtringued 14 hours ago [-]
Yeah, I personally wondered whether AMD was just copying Intel, because apparently every CPU manufacturer also needs to manufacture FPGAs, or whether they actually have a long term strategy where it is essential for both the FPGA and CPU departments to cooperate.
I think Xilinx did a fine job with their AI Engines and AMD decided to integrate a machine learning focused variant on their laptops as a result. The design of the intel NPU is nowhere near as good as AMD's. I have to say that AMD is not a software company though and while the hardware is interesting, their software support is nonexistent.
Also, if you're worried about FPGAs, that doesn't really make much sense, since Efinix is killing it.
opello 12 hours ago [-]
I briefly hoped that, like the integration of GPUs, there would be a broader integration of programmable logic in general purpose CPUs, with AMD integrating Xilinx fabric and Intel integrating Altera fabric. But I could never imagine a real use case and apparently there wasn't a marketable enough one either. Something like high-level synthesis ending up like CUDA always seemed like it would present a neat development environment for certain optimizations.
nickpsecurity 10 hours ago [-]
I wanted that, too. Then, integrated with something like Synflow:
Alteras tools seemed more civilized than Xilinx's, in my limited experience.
svnt 16 hours ago [-]
For those keeping score at home, 51% sold at a total valuation of $8.75B, which means they are bringing in around $4.5B, and recognizing a loss of roughly 50% on what was their biggest deal ever when it took place in 2015.
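(Spelling out that arithmetic, using only the figures quoted in this thread:)

    # Sketch of the deal math above; figures are the ones quoted here.
    deal_valuation_bn = 8.75   # total valuation implied by the sale
    stake_sold = 0.51          # fraction sold to the PE buyer
    purchase_price_bn = 16.7   # what Intel paid in cash in 2015

    cash_in_bn = stake_sold * deal_valuation_bn              # ~4.46
    loss_fraction = 1 - deal_valuation_bn / purchase_price_bn
    print(f"cash in ~${cash_in_bn:.1f}B, loss ~{loss_fraction:.0%}")  # ~48%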
Jach 15 hours ago [-]
"In December 2015, Intel acquired Altera for $16.7 billion in cash."
$21.5 bn inflation adjusted. Amazing ten year performance.
addaon 15 hours ago [-]
Sure, it was down 60%. But the real question is whether it outperformed Intel as a whole, and outperformed other internal investments Intel could make. I certainly wouldn't think that a 2015 dollar anywhere else within Intel is worth more than 40¢ today, given how they've been running.
janalsncm 15 hours ago [-]
Just to put numbers to it, Intel was $34 in 2016. It’s $20 now. So a dollar in Intel in 2016 would be worth 59 cents today.
ETA they also paid out almost $10 in dividends.
Tuna-Fish 15 hours ago [-]
You are ignoring dividends. They have paid something like ~$9 of dividend per share since 2016.
janalsncm 14 hours ago [-]
My mistake. Like $9.86 in dividends.
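(A quick total-return sketch with those numbers, ignoring dividend reinvestment timing and taxes:)

    # Total-return sketch using the share price and dividend figures above.
    price_2016 = 34.0
    price_now = 20.0
    dividends_paid = 9.86

    price_only = price_now / price_2016                        # ~0.59
    with_dividends = (price_now + dividends_paid) / price_2016
    print(f"price only: {price_only:.2f}, with dividends: {with_dividends:.2f}")  # 0.59 / 0.88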
scottyah 15 hours ago [-]
Or they got what they wanted from it and are selling off the rest, like when Google bought Motorola Mobility for the patents, then sold off the non-googly employees, culture, and brand for cheap.
blitzar 7 hours ago [-]
> sold off the non-googly employees
Ouch - your work is so good we will pay 10x what it is worth, because we are not good enough to do it.
But you are not good enough for us. Maybe they couldn't invert a binary tree.
KoolKat23 10 minutes ago [-]
Looks like Intel is being stripped for parts.
Jach 15 hours ago [-]
Man I remember being excited when Intel bought Altera, maybe they'd bring FPGAs to the masses, then they proceeded to do nothing with them...
jeffparsons 14 hours ago [-]
I was excited, too. I was also excited when Intel announced Larrabee.
That was before I learnt about the many and varied ways in which Intel sabotages itself, and realised that Intel's underperformance has little to do with a lack of good technical ideas or talent.
I.e. I was young and naive. I am now considerably less young, and at least a little less naive.
bigfatkitten 16 hours ago [-]
It was a silly acquisition in the first place, and their justification clearly came from a coke-addled fever dream.
Intel soon discovered the obvious, which is that customers with applications well-suited to FPGAs already use FPGAs.
danielmarkbruce 16 hours ago [-]
There was some hope at the time that FPGAs could be used in a lot more applications in the data center. It is likely still feasible. Remember Hennessy published:
And maybe this is/was a pipe dream - maybe there aren't enough people with the skills to have a "golden age of architecture". But MSFT was deploying FPGAs in the data center and there were certainly hopes and dreams this would become a big thing.
bigfatkitten 15 hours ago [-]
That was certainly the dream, but unfortunately for them it didn't turn out to be a new market.
danielmarkbruce 15 hours ago [-]
I don't know enough about hardware to know why - why didn't this story play out as hoped?
tux3 15 hours ago [-]
FPGA dev is just much more painful and more expensive than software dev at every step.
That's in no small part because the industry & tools seem to be stuck decades in the past. They never had their "GCC moment". But there's also inherent complexity in working at a very low level, having to pay attention to all sorts of details all the time that can't easily be abstracted away.
There's the added constraint that FPGA code is also not portable without a lot of extra effort. You have to pick some specific FPGA you want to target, and it can be highly non-trivial to port it to a different one.
And if you do go through all that trouble, you find out that running your code on a cloud FPGAs turns out to be pretty damn expensive.
So in terms of perf per dollar invested, adding SIMD to your hot loop, or using a GPU as an accelerator may have a lower ceiling, but it's much much more bang for the buck and involves a whole lot less pain along the way.
PaulHoule 14 hours ago [-]
It's hard to find places where FPGAs really win. For relatively simple tasks FPGA can beat just about anything in latency. For instance for the serialization/deserialization end of a high frequency trading system. If a problem has a large working set and needs to store data in DRAM it needs a memory controller the same way a CPU or GPU has a memory controller and this can only be efficient if the system's memory access pattern is predictable.
You can certainly pencil out FPGA or ASIC systems which would attain high levels of efficient parallelism if there weren't memory bandwidth or latency limits, but there are. If you want to do math that GPUs are good at, you use GPUs. Historically some FPGAs have let you allocate bits in smaller slices, so if you only need 6 bit math you can have 6 bit math, but GPUs are muscling in on that for AI applications.
FPGAs really are good at bitwise operations used in cryptography. They beat CPUs at code cracking and bitcoin mining but in turn they get beat by ASICs. However there is some number of units (say N=10,000) where the economics of the ASIC plus the higher performance will drive you to ASIC -- for Bitcoin mining or for the NSA's codebreaking cluster. You might prototype this system on an FPGA before you get masks made for an ASIC though.
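(Roughly, that FPGA-to-ASIC crossover is an NRE-amortization calculation; here's a toy sketch in Python with invented costs, just to show where a number like N=10,000 could come from.)

    # Hypothetical FPGA-vs-ASIC break-even: the ASIC has a large one-time NRE
    # (masks, tooling) but a lower unit cost. All costs below are made up.
    asic_nre = 2_000_000    # one-time cost in dollars (assumption)
    asic_unit_cost = 50     # per chip (assumption)
    fpga_unit_cost = 250    # per chip (assumption)

    breakeven_units = asic_nre / (fpga_unit_cost - asic_unit_cost)
    print(f"ASIC pays off above ~{breakeven_units:,.0f} units")  # 10,000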
For something like the F-35 where you have N=1000 or so, couldn't care less about costs, and might need to reconfigure it for tomorrow's threats, the FPGA looks good.
One strange low N case is that of display controllers for retrocomputers. Like it or not a display controller has one heck of a parts count to make out of discrete parts and ASIC display controllers were key to the third generation of home computers which were made with N=100,000 or so. Things like
and are already expensive compared to the Raspberry Pi, so they tend to use either a microcontroller or FPGA; the microcontroller tends to win because an ESP32, which costs a few bucks, is, amazingly, fast enough to drive a D/A converter at VGA rates or push enough bits for HDMI!
rasz 12 hours ago [-]
>It's hard to find places where FPGAs really win.
Rapid product development. Got a project that needs to ship in 6-9 months and will be on the market for less than two years in small volume? Thats where FPGAs go. Medical, test and measurement, military, video effects, telepresence, etc.
varjag 5 hours ago [-]
Sure but only a tiny fraction of products in these markets require the performance calling for an FPGA.
rwmj 2 hours ago [-]
I'm not sure about that. In these fields there are plenty of places where you need to ingest or process masses of data (eg. from a sensor in a medical device), and you're only going to sell 5 of these machines a month for 100K each, so $3000+ bill of materials for an FPGA to solve the problem makes sense.
The problem (for Intel) is that you don't sell billions of dollars of FPGAs into a mass market this way.
jcranmer 12 hours ago [-]
Most of my knowledge about FPGAs come from ex-FPGA people, so take this with a grain of salt:
First off, clock rates on an FPGA run at about a tenth that of CPUs, which means you need a 10× parallelism speedup just to break-even, which can be a pretty tall order, even for a lot of embarrassingly parallel problems.
(This one is probably a little bit garbled) My understanding is that the design of FPGAs is such that they're intrinsically worse at getting you FLOP/memory bandwidth number than other designs, which also gives you a cap on expected perf boosts.
The programming model is also famously bad. FPGAs are notorious for taking forever to compile--and the end result of waiting half an hour or more might simply be "oops, your kernel is too large." Also, to a degree, a lot of the benefits of FPGA are in being able to, say, do a 4-bit computation instead of having to waste the logic on a full 8-bits, which means your code also needs to be tailored quite heavily for an FPGA, which makes it less accessible for most programmers.
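(On the first point, the break-even factor is basically just the clock ratio; a trivial sketch, with the clock speeds below as representative assumptions rather than measurements.)

    # Rough break-even: how much more work per cycle an FPGA design must do
    # to match a CPU core, ignoring memory effects. Clocks are assumptions.
    cpu_clock_ghz = 4.0     # typical desktop CPU clock (assumption)
    fpga_clock_ghz = 0.4    # typical FPGA fabric clock (assumption)

    breakeven_parallelism = cpu_clock_ghz / fpga_clock_ghz
    print(f"need ~{breakeven_parallelism:.0f}x parallelism just to break even")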
bjourne 15 hours ago [-]
Tooling, mostly. To write fast code for CPUs you need a good optimizing compiler, like clang or gcc. Imagine how much work has gone into making them good. We're talking thousands of man years over several decades. You need just as good tooling for FPGAs and it takes just as much effort to produce. Except the market is orders of magnitude smaller. You also can't "get help" from the open source community since high-end FPGAs are way too expensive for most hackers.
Intel tried to get around this problem by having a common framework. So one compiler (based on clang) with multiple backends for their CPUs, FPGAs, and GPUs. But in practice it doesn't work. The architectures are too different.
timschmidt 15 hours ago [-]
There is nothing quite like gcc or LLVM for FPGAs yet. FPGA tooling is still stuck in the world of proprietary compilers and closed software stacks. It makes the whole segment of the industry move slower and have higher friction. This is just starting to break with Yosys and related tools, which are showing wild advantages in efficiency over some of the proprietary tooling, but still only support a fraction of available chips, mostly the smaller ones.
smj-edison 15 hours ago [-]
I'm just a casual observer, but I'm pretty sure one hard thing about FPGAs is preventing abuse. A customer could easily set up a ring oscillator that burns out all the LUTs. Another thing is FPGAs are about 10x slower than dedicated logic, so CPUs/GPUs beat them for a lot of applications. Plus, there's not a lot of logic designers in the first place. Software skills don't transfer over very well. For example, a multiplier is about the size of 8kb of RAM, so lookups and complex flow are way more expensive than just multiplying a value again (kinda like GPUs, except if you only had an L1 cache without main memory).
smj-edison 13 hours ago [-]
Not sure why I'm being downvoted, would those who downvoted me explain why? I try to be accurate so if I missed any important details I'd like to know :)
fecal_henge 7 hours ago [-]
I don't know how a ring oscillator specifically could burn out LUTs. All switching contributes to the device temperature. If they get too hot then they will enter thermal protection mode.
We run such oscillators as dummy payloads for thermal tests while we are waiting for the real firmware to be written.
timewizard 14 hours ago [-]
CPU branch prediction got exponentially better. Better than most could have imagined.
matt3210 15 hours ago [-]
It made their stock pop for a while, which was all that mattered to Brian Krzanich, who took the bonus and left the mess in the hands of Bob Swan, who did the same things and left the mess ... (recursion here).
nativeit 15 hours ago [-]
> Intel soon discovered the obvious, which is that customers with applications well-suited to FPGAs already use FPGAs.
So selling FPGA's was a bad move? Or was the purchase price just wildly out-of-line with the--checking...$9.8B annual market that's expected to rise to $23.3B by 2030?
bcrl 14 hours ago [-]
Intel can't even act as a functional foundry for internal customers.
komadori 16 hours ago [-]
Do you think AMD's decision to buy Xilinx was any better or not?
Alupis 16 hours ago [-]
Perhaps we can say it was less of a distraction for AMD, given AMD is not having the basic execution issue that Intel is currently suffering.
rcxdude 15 hours ago [-]
And less disastrous for Xilinx, given they could basically just keep going as they were before, instead of being significantly diverted onto a sinking ship of a process.
georgeburdell 16 hours ago [-]
If AMD did the same thing years later, was it really that foolish?
stonogo 15 hours ago [-]
Yes, because AMD is fabless, and Xilinx didn't suddenly have to figure out how to work around Intel's production problems.
mschuster91 16 hours ago [-]
> Intel soon discovered the obvious, which is that customers with applications well-suited to FPGAs already use FPGAs.
Yes, but pairing an FPGA somewhat tightly integrated with an actually powerful x86 CPU would have made an interesting alternative to the usual FPGA+some low end ARM combo that's common these days.
michaelt 15 hours ago [-]
Sure, if they wanted to intel could have done what nvidia did with CUDA: Put the tech into everything, even their lowest end consumer devices, and sink hundreds of millions into tooling and developer education given away free of charge.
And maybe it would have lead somewhere. Perhaps. But they didn't.
danielmarkbruce 15 hours ago [-]
It was the thought at the time that they'd do this. It's amazing that they don't seem to have actually tried. Any sense as to why, or what went wrong?
michaelt 15 hours ago [-]
I wasn't there, but I've always imagined the conversation went something like this:
Intel: Welcome, Altera. We'd like you to integrate your FPGA fabric onto our CPUs.
Altera: Sure thing, boss! Loads of our FPGAs get plugged into PCIe slots, or have hard or soft CPU cores, so we know what we're doing.
Intel: Great! Oh, by the way, we'll need the ability to run multiple FPGA 'programs' independently, at the same time.
Altera: Ummmm
Intel: The programs might belong to different users, they'll need an impenetrable security barrier between them. It needs full OS integration, so multi-user systems can let different users FPGA at the same time. Windows and Linux, naturally. And virtual machine support too, otherwise how will cloud vendors be able to use it?
Altera: Uh
Intel: We'll need run-time scaling, so large chips get fully utilised, but smaller chips still work. And it'll need to be dynamic, so a user can go from using the whole chip for one program to sharing it between two.
Intel: And of course indefinite backwards compatibility, that's the x86 promise. Don't do anything you can't support for at least 20 years.
Intel: Your toolchain must support protecting licensed IP blocks, but also be 100% acceptable to the open source community.
Intel: Also your current toolchain kinda sucks. It needs to be much easier to use. And stop charging for it.
Intel: You'll need a college outreach program. And a Coursera course. Of course students might not have our hardware, so we'll need a cloud offering of some sort, so they can actually do the exercises in the course.
Altera: I guess to start with we
Intel: Are you profitable yet? Why aren't you contributing to our bottom line?
jcranmer 12 hours ago [-]
I think they have tried to improve the software for FPGAs--FPGA backends are part of their oneAPI software stack, for example. And when I was in grad school, Intel was definitely doing courses on building for FPGAs using OpenCL (I remember seeing some of their materials, but I don't know much about them other than that they existed).
As to why it didn't work, well, I'm not plugged into this space to have a high degree of certainty, but my best guess is "FPGAs just aren't that useful for that many things."
marcosdumay 15 hours ago [-]
Yes, if they had actually made the thing available, maybe people would have used it for something. There were several proofs of concept at the time, with some serious gains, even for the uses that people ended up using CUDA for.
But they didn't actually sell it. At least not in any form anybody could buy. So, yeah, we get the OP claiming it was an obvious technological dead-end.
And if they had included it on lower-end chips (the ones they sold just a few years after they bought Altera), we could have had basically what the Raspberry Pi RP2040 is nowadays. Just a decade earlier and controlled by them... On second thought, maybe this was for the best.
bigfatkitten 16 hours ago [-]
Applications that benefit from the Zynq-style combination (e.g. radio systems) generally take that approach because they have SWaP concerns that preclude the use of big x86 CPUs in the first place.
snihalani 16 hours ago [-]
What's a SWaP concern?
201984 16 hours ago [-]
Size, Weight, and Power. It would be very nice if people would take the two seconds to type those words out instead of using ungoogleable acronyms on a public forum unfamiliar with the terms.
superkuh 16 hours ago [-]
Size Weight and Power.
snihalani 14 hours ago [-]
I'd nominate another P for Profit.
dtquad 14 hours ago [-]
GPGPUs ended up becoming the AI/cloud accelerators that FPGAs promised to be back when Intel bought Altera.
FPGAs are not ideal for raw parallel number crunching like in AI/LLMs. They are more appropriate for predictable real-time/ultra-low-latency parallel things like the modulation and demodulation of signals in 5G base stations.
AlotOfReading 13 hours ago [-]
FPGAs might not be ideal, but AMD's NPU IP originated with Xilinx.
Intel was an early player in so many massive industries (e.g. XScale, GPGPU, hybrid FPGA SoCs). Intel abandoned all of them prematurely and has been left playing catch-up every time. We might be having a very different discussion if literally any of them had succeeded.
coredog64 13 hours ago [-]
XScale was forced on Intel as a penalty for anticompetitive activities against Digital. It’s no surprise then that they weren’t interested in doing anything with it.
ACAVJW4H 15 hours ago [-]
Quick search shows Altera held 30% of the FPGA market. That puts AMD’s $50B acquisition of Xilinx (which holds ~50% of the market) in an awkward light. Using some extremely crude math, Xilinx’s fair market value might now be closer to ~$15B.
Did AMD massively overpay, or has the FPGA market fundamentally shifted? Curious to see how this new benchmark ripples into AMD’s stock valuation.
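Spelling out the crude math (using the market-share figures above, and treating share of market as a rough proxy for value):

    # Back-of-envelope: scale Altera's implied valuation by relative market share.
    altera_valuation = 8.75e9   # valuation implied by the Silver Lake deal
    altera_share = 0.30         # rough FPGA market shares quoted above
    xilinx_share = 0.50
    implied_xilinx = altera_valuation * (xilinx_share / altera_share)
    print(f"~${implied_xilinx / 1e9:.1f}B implied for Xilinx vs the ~$50B AMD paid")
    # -> ~$14.6B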
fuzzythinker 14 hours ago [-]
Not all market share is equal, like iPhone vs. Android. Also, the market leader will command a higher price than the second in line.
timewizard 14 hours ago [-]
The FPGA market shifted. For a brief moment they were allowed to be on BOMs of end user devices due to the rest of the computing field lagging behind somewhat. That period, as far as I can tell, is over.
My anecdotal example would be high-end broadcast audio processors. These do quite a bit beyond the actual processing of audio, extending in particular into baseband or even RF signal generation.
In any case, these devices used to be fully analog; when they first went digital they were a combination of DSPs for the processing and FPGAs for signal output. Later generations dropped the DSPs and did everything in larger FPGAs as those became available. The latest generations dropped that whole stack and just run on an 8-core Intel processor with real-time Linux and some specialized real-time signal processing software, feeding custom-designed signal generators.
High-core-count, high-frequency CPUs became good enough, and getting custom-made chips became exceptionally cheap as well. FPGAs became rather pointless in this pipeline.
The US military, for a time, had a next-generation radio specification that specifically called for FPGAs, as that would allow manufacturer-agnostic radios with custom software. That never panned out, but it shows FPGA usage at its peak, working around the constraints of that era.
mastax 16 hours ago [-]
Intel acquired Altera in December 2015 for $16.7 billion in cash.
nativeit 15 hours ago [-]
If only someone could have come up with a plausibly profitable use-case for advanced FPGAs and highly performant, efficient, real-time processing or hardware acceleration in those intervening years? What are ya gonna do?
xadhominemx 15 hours ago [-]
Well the thesis was DC accelerators but the world went with ASSPs.
mmmBacon 15 hours ago [-]
When Intel acquired Altera, Altera’s market share was at 36% and Xilinx at 51%. Today Xilinx remains at ~50% while Altera’s share has dropped to 29%. Altera has lost share to Microchip and Lattice.
I’ve said it before, Intel is where technology companies go to die. Fortunately while Altera is probably a mess of useless Intel drone MBAs, there’s a decent core that can be salvaged. Best of luck to them.
rsp1984 16 hours ago [-]
Should change the title. They sold 51% at a valuation of $8.75B, so cash in is ~$4.46B.
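The arithmetic, assuming the proceeds scale linearly with the 51% stake at the headline valuation:

    # Cash proceeds if 51% changes hands at the headline $8.75B valuation.
    valuation = 8.75e9
    stake_sold = 0.51
    print(f"~${valuation * stake_sold / 1e9:.2f}B")   # -> ~$4.46B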
voxadam 15 hours ago [-]
I've updated the title as best as I could within the constraints of the max length.
tim333 4 hours ago [-]
Summary of the situation from The Register headline:
>Intel flogs off majority stake in Altera to private equity for $4B
>Buy high, sell low: FPGA biz cost x86 giant $16B decade ago
for those not up on this stuff
bjourne 15 hours ago [-]
Apparently, the FPGA industry wasn't large enough for two major players. Maintaining an extremely specialized developer ecosystem for a relatively small niche can't have been cheap. Almost zero cross-over too, since FPGA tooling is much too foreign to be repurposed for other architectures. I suspect this move will make it a bit harder for Intel to collect "developer mindshare" for their other hyped-up stuff, because no one likes having the rug pulled out from under them. Hope AMD can do a better job with Xilinx than Intel did with Altera.
rasz 15 hours ago [-]
Intel's FPGA venture made tons more sense than AMD following suit. FPGAs are great at filling up your idle fabs and honing engineering skills on reaching high yields.
Selling now also makes sense. There was only one serious competitor in 2015. Now you've got tariffs both ways to the main place where everything is built, and said place has its own homegrown vendors like GOWIN, Sipeed, and Efinix. But the biggest reason is that the amount of stuff designed in the West/Taiwan is falling as China takes over actual product design.
https://itif.org/publications/2024/08/19/how-innovative-is-c...
>In 2015, China released its “Made in China 2025” (MIC 2025) strategy, which refined some of these targets, setting a goal of achieving 40 percent self-sufficiency in semiconductors by 2020 and 70 percent by 2025.
https://en.wikipedia.org/wiki/Made_in_China_2025
>In 2024, the majority of MIC 2025's goals were considered to be achieved, despite U.S. efforts to curb the program.
Products coming out of China no longer use STM microcontrollers, Vishay/Analog MOSFETs/diodes, or Altera/Xilinx FPGAs. It's all Chinese semiconductor brands you've never heard of. A good example is this teardown of the Deye SUN-5K-SG04LP1 5kW hybrid solar inverter: https://www.youtube.com/watch?v=n0_cTg36A2Q
bjourne 7 minutes ago [-]
Intel was looking to sell Altera for over a year before Trump's Tariff Tourettes. And I bet it wasn't the hardware that was the problem, it was the software. No matter how amazing your FPGA hardware is, it is useless if you can't also produce high-quality software for operating it. For CPUs you can just tell people to use gcc or clang, not so with FPGAs.
TheMagicHorsey 14 hours ago [-]
I used to work at Intel (around 1999) in their Jones Farm campus in Oregon. My employee stock grants from that time are still underwater.
This was the heyday at Intel. I left within a year because I noticed that the talent that was respected, compensated and influential at Intel was the sales engineers. I can't pretend to have known that would lead to the decline of the company, but I knew that, as an engineer uninterested in sales, it wasn't the place for me.
ChrisGammell 13 hours ago [-]
I'd love to hear more about how the "sales engineers were the influential ones" manifested. I have an idea in my head, but I'm curious about details.
skeptrune 12 hours ago [-]
What would sales engineers be responsible for at a company like Intel? I thought that was more of a SaaS thing.
flanfly 13 hours ago [-]
Props to Intel for duping AMD into buying Xilinx for a whopping $50B.
Panzer04 7 hours ago [-]
AMD bought an overpriced company with their own overpriced stock. Probably not as bad as it might look.
MangoCoffee 14 hours ago [-]
What a waste! I can never understand corporate thinking and how CEOs get such massive fucking pay for decisions like this.
Intel paid $16.7 billion in 2015 and sold it for $8.75 billion?! What about all the money dumped into Altera from 2015 to 2025? How much was that? Is Intel just handing over the FPGA market to AMD?
https://download.intel.com/newsroom/2021/archive/2015-12-28-...
> Is Intel just handing over the FPGA market to AMD?
Maybe? But who cares. From all of the comments above, I learned that the FPGA market is stalled or shrinking. Even AMD likely overpaid for Xilinx.
Alupis 16 hours ago [-]
I wonder if we'll see more Intel sell-offs, as Tan et al try to get things under control.
Will we see an AMD-esque fab spin-off?
DebtDeflation 2 hours ago [-]
Beyond ensuring adequate cash flow, they need to be 100% focused on getting 18A shipping in volume as soon as possible rather than financial engineering stuff.
nxobject 16 hours ago [-]
Would market regulators allow a single buyer to acquire all of Intel's fabs in one go?
Alupis 15 hours ago [-]
My guess would be no, but I could be wrong. The current administration clearly wants more domestic capability, so even if someone like TSMC/Samsung/etc wanted to acquire them as part of their US operations, my gut says it would be challenged.
When AMD spun off their fabs into what became Global Foundries, it was difficult for many to see the upside. However, today, it seems not being tied to any particular fab/tech is one of AMD's biggest advantages.
jsight 15 hours ago [-]
I'd guess that they'll continue to sell off Mobileye over time.
Calwestjobs 15 hours ago [-]
Anyone who hoped Lunar Lake to buy Altera give thumbs up.
bcrl 14 hours ago [-]
Has everyone already forgotten about Optane, too?
ein0p 6 hours ago [-]
That whole acquisition was always a head-scratcher to me. OK, you pay nearly $17B for the thing, and then do absolutely nothing with it. How does that work? Even during its best years, $17B was big money for Intel.
matt3210 15 hours ago [-]
Intel's problem is that they're trying to deliver short term shareholder value instead of long term stable value.
lvl155 15 hours ago [-]
Not farfetched to think they’re maybe 6-8 quarters away from imploding. They need to survive.
DebtDeflation 2 hours ago [-]
That's a fair assessment. If they're not shipping 18A in volume early next year they likely will end up getting parted out later next year.
sambull 15 hours ago [-]
They'll give any market a good 18 months and then dip
dboreham 12 hours ago [-]
I left the hardware business in 1992, but it seems that nothing has changed. People still think FPGAs are really cool and nobody can figure out how to make money selling them.
xyst 14 hours ago [-]
Leadership at Intel needs to go. They cannot be allowed to be C-level execs at any company.
Selling out to PE is a signal this company is about to get gutted and loaded to the tits with debt and management fees from PE.
unethical_ban 14 hours ago [-]
Was Altera the thing they bought to do some really cool networking/switching/SDN stuff? Paging bcantrill.
wmf 9 hours ago [-]
You're thinking of Barefoot which is also dead. (And Fulcrum before that.)
saagarjha 11 hours ago [-]
You might be talking about Tofino?
sandworm101 15 hours ago [-]
I cannot help myself. My brain parks "Altera" right beside the Weyland and Jupiter Mining corps.
https://subnautica.fandom.com/wiki/Alterra_Corporation
Seems quite cheap. If I were a state I'd buy it. Possibly give a stake to a suitable university and then create internships and other learning opportunities. I would also subsidise products for SMEs and then invest more to ensure the company can supply defence and other industries, decoupling the country from dependence on other countries for crucial tech.
I mean it's a pipe dream, but why not.
fc417fc802 8 hours ago [-]
I think nationalization is usually frowned on in the west, but your comment about universities got me wondering. It seems small enough that the state could donate it to a consortium of research universities. That'd have to be better than PE in terms of serving the national interest, wouldn't it?
graymatters 9 hours ago [-]
The business geniuses of Intel bought Altera for nearly $17B in 2015. Now they've sold control at a valuation barely half of that, after official inflation of over 30% during that time. Which means it lost over 2/3 of its value (take into account also the lost interest on that money). Given that they gave up control for half the money, it's effectively as if it was relinquished for a third of what Intel paid for it.
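Spelling out that arithmetic (using the ~30% cumulative inflation figure above and ignoring whatever Intel sank into Altera after 2015):

    # Value recovered vs. price paid, in rough 2015 dollars (figures from above).
    paid_2015 = 16.7e9
    deal_valuation_2025 = 8.75e9        # 51% sold plus 49% retained at this valuation
    cumulative_inflation = 0.30         # "official inflation of over 30%" per above
    recovered_2015_dollars = deal_valuation_2025 / (1 + cumulative_inflation)
    print(f"recovered ~{recovered_2015_dollars / paid_2015:.0%} of what was paid")
    # -> ~40%, i.e. a ~60% real loss before counting the forgone returns on that cash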
So far, none of the responsible Intel execs has paid any price for such an atrocious loss of stakeholder value. They need to be in jail and lose their personal wealth to repay the stockholders.
The SEC should investigate them and see whether there was any insider trading to benefit from this horrible value loss.
This criminal lack of performance needs to be brought up during the upcoming shareholders' meeting. Those responsible must pay the price.
Would you hire the Intel CEOs, the head of Intel Capital, or any members of Intel's board of directors again after such abysmal performance?