The Abit BP6 taught me so much about multithreaded programming. Programs that previously worked fine crashed instantly due to incorrect locking. It really forced me to think differently about concurrent programs.
After that I never found multithreaded programming particularly difficult. Challenging at times yes, but thanks to my newfound mental model not difficult.
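For anyone who never hit this first-hand, here's a minimal sketch of the sort of bug I mean (my own illustration, not code from back then, assuming pthreads and gcc -pthread): an unsynchronized shared counter that usually looks fine on a single CPU, where threads are only time-sliced, but loses updates (and, in less trivial code, corrupts state and crashes) once two CPUs genuinely run the threads at the same time. The fix is ordinary mutex locking.

    #include <pthread.h>
    #include <stdio.h>

    #define ITERS 1000000

    static long counter = 0;
    static pthread_mutex_t lock = PTHREAD_MUTEX_INITIALIZER;

    /* Racy: an unlocked read-modify-write. On a uniprocessor this often
       "works"; on a dual-CPU box like the BP6 the two threads really do
       run concurrently and increments get lost. */
    static void *racy(void *arg)
    {
        (void)arg;
        for (int i = 0; i < ITERS; i++)
            counter++;              /* not atomic: load, add, store */
        return NULL;
    }

    /* Correct: the same loop with the increment under a mutex. */
    static void *locked(void *arg)
    {
        (void)arg;
        for (int i = 0; i < ITERS; i++) {
            pthread_mutex_lock(&lock);
            counter++;
            pthread_mutex_unlock(&lock);
        }
        return NULL;
    }

    static void run(void *(*fn)(void *), const char *name)
    {
        pthread_t a, b;
        counter = 0;
        pthread_create(&a, NULL, fn, NULL);
        pthread_create(&b, NULL, fn, NULL);
        pthread_join(a, NULL);
        pthread_join(b, NULL);
        printf("%s: %ld (expected %d)\n", name, counter, 2 * ITERS);
    }

    int main(void)
    {
        run(racy, "racy");      /* typically short of 2000000 on SMP */
        run(locked, "locked");  /* always 2000000 */
        return 0;
    }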
I had those brass-looking cylindrical coolers[1] from Zalman, and the two of them next to each other looked quite distinctive.
Had the motherboard for many years as a homelab server.
I bought a few more Abit boards after that, but the capacitor plague made me switch to Asus IIRC, and then Abit folded.
But the BP6 will forever be with me as an incredibly cool motherboard that did something unique in the consumer space.
[1]: https://www.cablesonline.com/soc370airrou.html (except brass finish)
The Super Socket 7 motherboards were amazing.
They were backwards compatible with Socket 5 (you had to set the voltages with motherboard jumpers, though).
Some of these boards had both SDRAM and EDO RAM slots, along with an AGP slot, PCI slots, and an ISA slot.
So you had an era where motherboards could take a P-75 or an AMD K6 550 CPU. They could take RAM scavenged from an old 486 (EDO RAM) or you could put in faster RAM. You could run a PCI graphics card if that was all you had, or you could run an AGP card. I used my old 486's ISA Sound Blaster AWE in that board for a long, long time, since PCI was of no benefit for a sound card.
The only set of CPUs not compatible were the Slot 1 and Socket 370 CPUs. But they were pretty expensive anyway, and it was fun to be able to frankenstein computers so much back in the day.
I did love that era, in terms of it providing a young frugal person with the opportunity to buy upgrades piecemeal. It felt like there was more generational overlap, as you describe, so it was possible to just go out and buy a new CPU, or a new graphics card, for a few hundred carefully saved dollars of birthday and Christmas money, and get a sizeable upgrade in performance. That era is over, especially with the current pricing crunch.
What I am hoping for is that this leads to a resurgence for all those used computers out there... plenty of great machines from the last decade that should have no problem being competent workstations for 90% of people's needs for the next decade onward if treated well. This is where open standards and open source truly shine.
It was an era where there was actual competition in the motherboard space as different vendors tried to outdo each other with their northbridge and southbridge and especially the connection between them. Computer magazines at the time actually benchmarked motherboards. Then Intel and AMD slammed the door shut on that market by moving the important functionality into the chip and now nobody cares about the motherboard very much.
Pretty sure I had a Pentium 4 mobo that was kind of like that in the 2002-2003 timeframe. Was still rocking my old ISA Sound Blaster 16 (the big-ass one with the connector for a CD-ROM drive) alongside a Radeon 7500 in the AGP slot.
It wasn't much but I could run Alice, Max Payne, GTA 3, Dungeon Siege on there, all at like mid settings, so I was a pretty happy camper for a high school kid putting paper route money into my own PC.
ISA slots were definitely rather rare on motherboards by the time you got to the Pentium 4 era, so that's cool that you managed to find one that also offered DMA, since I believe Sound Blaster cards needed that to properly function.
I think I would have done the same with my AWE64 Gold if that was still an option for me in the early 2000s.
Having googled it a bit now, it's fully possible I have my wires crossed, since I know that P4 machine had the SiS 645 chipset which of course had built in audio.
I definitely used the Sound Blaster with my 486DX100, and I recall migrating it to at least one other machine after that; it was nice for the joystick port and also the better wavetable synth on classic games.
I posit the opposite. Super Socket 7 motherboards were a terrible choice aimed at suckers trapped by sunk cost fallacy.
>along with an AGP slot
A non-working AGP slot, or rather one that worked until you tried to play 3D games with a 3D accelerator that actually used AGP features; then you got crashes no matter the chipset (VIA, ALi). The solution was switching to 1x mode, disabling sideband addressing, or just swapping to a 3dfx card.
The 1998 release of the Intel Celeron killed any possible K6 advantage. Prices from https://akiba-pc.watch.impress.co.jp/hotline/981226/p_cpu.ht... (at ~120 yen to $1):
Celeron 300A (300 MHz): ¥10,440 (~$90)
K6-2/300: ¥10,850 (~$100)
Motherboard prices from https://akiba-pc.watch.impress.co.jp/hotline/981226/newitem....:
ZIDA BXi98-ATX (440BX, ATX, AGP x1, PCI x4, PCI/ISA x1, ISA x1, DIMM x3): ¥15,800 (~$140)
FIC PA2013 (MVP3, ATX, 2MB cache, AGP x1, PCI x3, PCI/ISA x1, ISA x1, DIMM x3): ¥13,800 (~$130)
>an AMD K6 550 CPU
that's year 2000.
>The only set of CPUs not compatible were the Slot 1 and Socket 370 CPUs. But they were pretty expensive anyway
You are comparing bottom-of-the-barrel AMD CPUs with top-spec Pentium IIIs. The correct comparison should be against Celerons. January 2000 prices: https://akiba-pc.watch.impress.co.jp/hotline/20000617/p_cpu....
K6-III/450: ¥14,550 ($140)
K6-III/400: ¥8,980 ($85)
Celeron 300A: $57
300A@450MHz beats K6-III/450@550MHz in every possible benchmark.
By June 17, 2000 (https://akiba-pc.watch.impress.co.jp/hotline/20000617/p_cpu....):
Celeron 533A: ¥10,570 ($100)
Celeron 366 MHz: ¥7,700 ($73)
Duron 600 MHz: ¥9,990 ($95)
K6-III/450: ¥24,800 ($236) !??!!?
K6-III/400: ¥14,800 ($140)
K6-2/550: ¥7,949 ($76)
K6-2/533: ¥5,970 ($57)
K6-2/500: ¥5,350 ($50)
The $76 K6-2/550 is slower than the $73 Celeron 366, not to mention pulverized in benchmarks if you happened to find a Celeron capable of a 100 MHz FSB.
Old, slow RAM makes a K6 setup even slower. You would think the benefit would be much cheaper motherboards, but even that wasn't the case: SS7 boards started at ~$75, while the Abit BE6-2 was $90 and the cheapest 440BX boards (P2XBL) were $65. K6-2/550 3DNow! (100 MHz bus) $90 vs Celeron 500 $93: https://archive.org/details/computer-shopper-2000-07/page/n3...
Slot 1 made much more sense; only the release of the K7 made AMD competitive again, with the Duron on the low end and the Athlon way ahead of the P3.
Interesting read. I had the Abit BP6 and it was a killer in performance/price. The problem I had with it wasn't the capacitors but rather that the PCB itself was a bit thin to support 2x CPUs/fans.
Another cool thing was that the BP6 supported Ultra DMA/66 (aka ATA/66), and it did so by adding a second controller, so you had twice as many buses. Looking at a pic of it now, it really was a Franken-machine with AGP, PCI, and ISA buses too.
Yes, mine bowed eventually, even though I put non-conductive closed-cell foam under the CPU areas.
Still, I made good on my promise to never return to single core machines.
When I used the Abit BP6 in a Linux box build, I did it as a one-Celeron budget PC with expandability, and put some notes on the Web at the time:
https://www.neilvandyke.org/cheap-pc-2000/
That page includes pricing info for each component, and how I bought it. For example:
> Abit BP6 Dual PPGA Socket-370 motherboard, UDMA-66, 2 ISA, 5 PCI, AGP 2X, 3 168-pin PC100 ECC, max. 1GB RAM. Retail version. (Essential Computing $120 + $14.25 UPS Ground + $3.60 insurance = $137.95)
> Intel Celeron 500 Retail version, with warranty and CPU fan and heat sink. (Egghead $135.99 + free UPS Ground = $135.99)
The box was my workstation, and for a time also a public Web server on ADSL. I never actually added a second Celeron (cost money, and I still wasn't feeling CPU pressure) nor the UDMA-66 (reported to be less reliable).
Ah, I kept that BP6 for 10 years before selling it. It meant I could write multithreaded concurrent software and run it at home, first with LinuxThreads (https://en.wikipedia.org/wiki/LinuxThreads) and later NPTL (the Native POSIX Thread Library).
Mine was not very stable under even moderate overclocking though!
Good times!
I had a lot of good luck with Abit motherboards. They did a three-day burn-in before shipping.
I liked that they showed schematics in their materials, and I still have a sticker from an old Celeron build. I booted it up recently and it still works.
This brings me back. My first DIY PC used an Abit motherboard. It was a great computer and was still functional after 5 years before I upgraded. I never knew about the poor quality capacitors. I guess I lucked out.
BX6 r2.0, the motherboard of the first PC I built myself, and still the favorite I've ever had to work with.
Abit, there's a name I haven't heard in a long, long time...
The thing that jumped out to me was the mention of the engineer jumping ship to DFI. Despite DFI still existing, they stopped making consumer stuff back in 2012, and it seems like they somehow disappeared from the consumer mindset even more than Abit did.
I recall that there was a while during the Athlon 64 era that DFI was the gaming board to get. But I feel like I hear references to Abit more often than DFI.
I think my old Opteron machine with a DFI board is kicking around somewhere still.
DFI LanParty was the hot shit in the overclocking community
I had a DFI LanParty with socket 939, an Athlon 3000+ Venice clocked 1GHz over stock, and 512MB of DDR-600. Big baller.
I loved Abit motherboards back in the day... sad to see the company die.
A friend of mine won an Athlon XP in a forum contest, I think it was Extreme Systems or Extreme Overclocking. He ended up pairing it with an Abit NF7-S, which I recall being a legendary board at the time. He brought it over to my place and we would LAN Unreal Tournament 2003. Those were the days!
This is a strangely specific article; niche on niche is putting it lightly.
Anyone in PC building during that era knew about Abit. Not really niche for a technical audience, but definitely nostalgic in a way that won’t make sense to anyone who wasn’t into PCs during that era.
> The Abit BP6 was legendary with enthusiasts because it let them make a dual CPU system with cheap Celeron CPUs.
And 2 celerons were cheaper than a CPU with double the performance?
Yes, because there weren't really CPUs then that had double the performance.
Celeron CPUs usually shared the same core architecture as the current Pentium line, but often had a lower core clock speed, a slower bus speed, and/or smaller L2 caches.
Workloads have different constraints, however, and simply doubling cache, clock speed, or memory bandwidth doesn't necessarily double performance, especially when running more than one application at once. Keep in mind, this is the Windows 98/NT/2000 era here.
Symmetric multiprocessing (SMP) could be of huge benefit, however, far more than simply doubling any of those factors. Running two threads truly in parallel was essentially unheard of on the desktop; that was usually reserved for higher-end parts, like full-fledged Pentium workstations and Xeons (usually the latter). Were two Celerons cheaper than a single fast CPU? Probably not in all cases (it depended on the speeds). But Abit's board gave users an option in between a single fast Pentium and an orders-of-magnitude more expensive professional workstation: a pair of cheap CPUs for desktop SMP. And that was within reach of far more people.
In short, two Celerons were probably more expensive than a single fast Pentium, but having SMP meant being able to run certain workloads faster, or more workloads at once, at a time when any other SMP system would have cost a fortune.
>Celeron CPUs usually shared the same core architecture as the current Pentium line, but often had a lower core clock speed, a slower bus speed, and/or smaller L2 caches.
This had an interesting side effect: Celerons of that era overclocked extremely well (stable 300 -> 500MHz+), due to the smaller and simpler on-die L2 cache relative to the Pentiums of the era, whose L2 cache was much larger but had to be off-die (and less amenable to overclocking) as a result.
An overclocked dual Celeron could easily outperform the highest-end Pentiums of the era on clock-sensitive, cache-insensitive applications, especially those designed to take advantage of parallelism.
IIRC the Celeron's cache was actually faster because it was on-die; this was mitigated on the Pentiums by there being more of it. It seemed like in games the faster cache performed better.
Another thing that helped the Celeron overclocking craze is that Intel seemed to damage the brand badly out of the gate. The original Celerons had no cache at all, performed terribly, and took a beating in PC reviews. So even though the A variants were much better, that early stink stuck to them.
The thing that probably helped the Celeron the most with overclocking, though, was that Intel gimped it with only a 66 MHz front-side bus. Since the multiplier was locked, raising the FSB was the only way to push the CPU speed up, and that worked in your favor: a capable motherboard could run a stable 100 MHz bus (a 300A's locked 4.5x multiplier goes from 4.5 x 66 = 300 MHz to 4.5 x 100 = 450 MHz), whereas you'd hit a lot more system-wide problems trying to push a Pentium's 100 MHz bus even higher.
That was a bit of a two-edged sword, as the heavily overclocked Celerons would benchmark extremely well but be somewhat disappointing in actual applications due to the lack of cache. It was right at the start of the era where cache misses became the defining factor in real-world performance. CPUs ran ahead of DRAM, and DRAM has never caught back up, even as per-core CPU performance plateaued.
Yeah; mine ran very stable at 466 for more than a decade. It was impressive.
You could attempt to head toward ~700 but I never could keep it stable there.
Going from a single CPU to a dual CPU would, in theory, double performance _at best_. In other words, only under workloads that supported multithreading perfectly.
But in the real world, the perceived performance improvement was more than doubling. The responsiveness of your machine might seem 10 or 100x improved, because suddenly that blocking process is no longer blocking the new process you're trying to launch, or your user interface, or whatever.
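The ceiling on the throughput side is just Amdahl's law (my framing, not the parent's): if a fraction p of the work can actually use the second CPU, the best you can do is

    speedup = 1 / ((1 - p) + p / 2)

so a well-threaded workload with p = 0.9 tops out around 1.8x, and p = 0.5 gives only about 1.33x. The "10 or 100x" feeling was about latency and responsiveness - the interactive task no longer waiting behind the busy one - rather than raw throughput.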
One thing I've noticed is that the phrase "CPU hog" has faded from common usage.
Very interesting observation. Multicore systems have been fairly standard for the last 10+ years, and while you occasionally notice a misbehaving process hog an entire core, it never visibly impacts system performance because there are still several other idle cores, so you don't notice said "hogs."
It's much rarer to see misbehaving multithreaded processes hog all of the cores. Perhaps most processes are not robustly multithreaded, even in 2025. Or perhaps multithreading is a sufficiently complex engineering barrier that highly parallelized processes rarely misbehave, since they are developed to a higher standard.
Just by the release MSRP:
2x Celeron 366 MHz @ $123 each - https://en.wikipedia.org/wiki/List_of_Intel_Celeron_processo...
1x Pentium III 733 MHz @ $776 - https://en.wikipedia.org/wiki/List_of_Intel_Pentium_III_proc...
And that's assuming that performance scales linearly with clock frequency (which it doesn't).
The Celeron 300A was the one most folks would go after for this. I don't recall the exact retail pricing at the time, but they were more or less guaranteed to overclock to 450 MHz and be fully stable. Typically retail pricing could be had at a discount to the published wholesale pricing within a couple of months of release, due to how quickly the market moved back then.
These were competing with PII processors in 1998, and for folks who wanted to go dual CPU it was the way to go.
There was a whole cottage industry of folks modding these CPUs as a small side hustle, for people who wanted to put them into an SMP system but were not comfortable soldering onto CPU pins themselves.
Performance really did mostly scale linearly with clock speed back then - but only for a single CPU. The dual-CPU setups were not nearly as efficient, due to software not being as multi-threaded as it is today. The big win was for folks with two monitors (rare!) who could run apps on their second monitor while playing games on the first. Typically you would only see frame-rate increases with CPU clock - and of course this was the very start of the serious 3D accelerator (3dfx, Nvidia, ATI) scene back then.
It was certainly the golden age of enthusiast computing - especially for gaming.
> There was a whole cottage industry of folks modding these CPUs as a small side hustle, for people who wanted to put them into an SMP system but were not comfortable soldering onto CPU pins themselves.
When Intel switched from Slot 1 to Socket 370, there was a market for "slocket" adapters that allowed Slot 1 motherboards to take Socket 370 CPUs. The best of these adapters worked out a way to re-enable SMP on Celerons by tweaking the pin layout to disable the lock Intel had added. What made the BP6 so popular was that it was a native dual-socket Socket 370 motherboard with this modification built in, so it could use unmodified dual Celerons out of the box.
> Performance really did mostly scale linearly with clock speed back then - but only for a single CPU. The dual-CPU setups were not nearly as efficient, due to software not being as multi-threaded as it is today. The big win was for folks with two monitors (rare!) who could run apps on their second monitor while playing games on the first. Typically you would only see frame-rate increases with CPU clock - and of course this was the very start of the serious 3D accelerator (3dfx, Nvidia, ATI) scene back then.
Even if you only had one monitor, multitasking was FAR better on a dual-CPU machine than on a single-CPU system. For example, if you were extracting a ZIP file, one CPU would get pegged at 100% but the system stayed responsive because the second CPU had headroom. If you use a dual-Celeron BP6 system, it's a much nicer, more modern-feeling experience than using a single-PII system, even one with a faster CPU and more cache.
The P3s often cost more than MSRP at retail back in the day, too, as they were supply-constrained in that period for various reasons, which heavily contributed to the popularity of BP6 builds with enthusiasts. Intel really struggled to ramp up P3 production.
Thanks for looking up the numbers!
That would be quite the "budget" SMP build. The 366MHz "Mendocino" was based on the prior Pentium II core I believe. So quite the disparity in single-threaded workloads.
You could overclock the Celeron and get even more performance. Both the Slot 1 and the ZIF-socket style...
For some reason you left off the part that explains that the Celeron had a PII core.
> Socket 370 era Celeron processors had a Pentium II core, but Intel disabled the ability to change the multiplier to discourage overclocking
Many, but not all. There were Coppermine derivatives eventually: https://www.cpu-world.com/CPUs/Celeron/TYPE-Celeron%20(Coppe...
They may have sought to discourage overclocking by locking the multiplier, but...
People pretty routinely nearly doubled the clocks on Celeron 300As, anyway. :)
The legend was that Celeron 300A CPUs packaged in Malaysia were more overclockable than those packaged in Costa Rica. I specifically hunted down a Malaysian one, and it happily ran at 450 MHz for years.
I remember that as well. The details elude me, but I seem to recall my 300A was running at 464.25 MHz on an Abit B7.
Not "for some reason"; I didn't see it as relevant. If anything, it being a PII-lite with overclocking disabled makes it seem like a worse option? What am I missing here?
On the Slot 1 version of that processor you could disable the multiplier lock by cutting one of the pins on the slot.
What CPU had double the performance of a (top-of-the-line) PII CPU at the time?
And then they left the overclocking back door wide open by giving the Celerons a 66 MHz FSB.
They may have been, yes. Back in those days, a system with multiple CPUs was meant for the server or enterprise workstation market and priced accordingly.
Celerons were consumer-grade budget kit.