One pretty good proxy for a country’s technological sophistication is its stock of supercomputers, which enable detailed simulations of phenomena as disparate as global climate, protein folding, and nuclear weapons reliability.
It is also easily quantifiable, since the website Top500 releases lists of the world's top 500 supercomputers twice a year.
Country | Count | System Share (%) | Rmax (GFlops) | Rpeak (GFlops) | Cores |
---|---|---|---|---|---|
China | 229 | 45.8 | 439,977,239 | 809,977,843 | 26,722,912 |
United States | 108 | 21.6 | 532,180,190 | 754,332,141 | 16,088,768 |
Japan | 31 | 6.2 | 109,436,242 | 170,880,045 | 5,710,372 |
United Kingdom | 20 | 4 | 41,729,303 | 52,509,525 | 1,625,892 |
France | 18 | 3.6 | 43,580,345 | 66,598,837 | 1,792,656 |
Germany | 17 | 3.4 | 60,502,637 | 86,333,952 | 1,575,350 |
Ireland | 12 | 2.4 | 19,789,320 | 25,436,160 | 691,200 |
Canada | 8 | 1.6 | 12,394,820 | 19,389,748 | 405,408 |
Italy | 6 | 1.2 | 31,110,650 | 49,243,746 | 814,864 |
Korea, South | 6 | 1.2 | 21,938,000 | 35,760,556 | 804,740 |
Netherlands | 6 | 1.2 | 9,334,060 | 11,925,504 | 326,880 |
Australia | 5 | 1 | 6,669,188 | 10,232,963 | 257,336 |
Sweden | 4 | 0.8 | 4,653,054 | 6,565,116 | 139,408 |
India | 4 | 0.8 | 8,358,996 | 9,472,166 | 272,328 |
Poland | 4 | 0.8 | 4,604,365 | 6,216,160 | 153,128 |
Russia | 3 | 0.6 | 4,580,250 | 7,940,005 | 178,180 |
Saudi Arabia | 3 | 0.6 | 10,109,130 | 13,858,214 | 325,940 |
Singapore | 3 | 0.6 | 4,308,220 | 5,525,299 | 146,112 |
Spain | 2 | 0.4 | 7,488,800 | 11,781,642 | 172,656 |
Taiwan | 2 | 0.4 | 10,325,150 | 17,297,190 | 197,552 |
Switzerland | 2 | 0.4 | 23,126,750 | 29,347,305 | 453,140 |
South Africa | 2 | 0.4 | 2,152,470 | 2,779,930 | 71,256 |
New Zealand | 1 | 0.2 | 908,892 | 1,425,408 | 18,560 |
Norway | 1 | 0.2 | 953,571 | 1,081,651 | 32,192 |
Brazil | 1 | 0.2 | 1,123,150 | 1,413,120 | 38,400 |
Finland | 1 | 0.2 | 1,250,000 | 1,689,293 | 40,608 |
Czech Republic | 1 | 0.2 | 1,457,730 | 2,011,641 | 76,896 |
In the latest list, released a few days ago, China (229/500) has more than twice as many top supercomputers as the US (108/500), though the US retains the lead in total Rmax.
(In general, the two countries have been level-pegging since 2016.)
Russia is a scientific desert as usual, with 3/500 top global supercomputers (Poland & Sweden – 4; Saudi & Singapore – 3).
On another note, Moore’s Law for supercomputers… remains more or less stalled, as I first pointed out in January 2016.
Summing the EU countries together yields at least 7m cores, which puts us above Japan. I wonder if Japan manufactures its own chips? As far as I know, China and the US do, while the EU does not.
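As a quick sanity check on that sum, here is a minimal Python sketch using the core counts from the table above (EU membership as of late 2018, so the UK is counted):

```python
# Sanity check of the "EU above Japan" core-count claim, using the table's numbers.
eu_cores = {
    "United Kingdom": 1_625_892, "France": 1_792_656, "Germany": 1_575_350,
    "Ireland": 691_200, "Italy": 814_864, "Netherlands": 326_880,
    "Sweden": 139_408, "Poland": 153_128, "Spain": 172_656,
    "Finland": 40_608, "Czech Republic": 76_896,
}
print(f"EU total: {sum(eu_cores.values()):,} cores")   # ~7.4 million
print("Japan:    5,710,372 cores")                     # from the table
```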
I’m surprised Saudi Arabia has about the same capacity as Taiwan. Is it some sort of oil prospecting box or just a big idle white elephant?
What is the difference between Rmax and Rpeak? Peak seems to be even higher than max.
I think the latter, because I remember having read about Cray supercomputers at Aramco as a teenager.
kinda surprised anatoly doesn’t see the obvious issue here with regard to china.
china can manufacture its own, very inferior, integrated circuits in huge numbers, and put together large, crude, brute force computer banks. and that's about it. it's pretty similar to how the command economy government can also mandate that contractors build gigantic concrete and steel cities that just sit there empty.
fred reed falls for this stuff, surprised to see our man in russia doing the same.
“But dey have so many komputerz.”
huge, power hungry, inferior computers that sit around doing…not much.
check out the huge technical sophistication gap between the US computers (and some of the other nations computers) versus the chinese ones.
but more importantly, what are they doing with all this floating point capability? not much, as far as i can tell. which is always the story with china.
china also makes more steel, concrete, solar photovoltaic panels, burns more coal in more coal power plants, and so on. they get much less bang for their buck, and almost never come up with anything new or novel. they barely improve on anything that comes from someplace else.
yeah, eventually they’ll overtake the west, because the west is self destructing. but their brute force emulation and copy model is not really that impressive, or worrying, by itself.
anatoly, what nm node is china on right now? they down at 7 nm? or wait, 12? 14? 22? 32? no? no, they’re not even that far.
On a more optimistic note, Russia seems to be working on a nuclear propulsion spacecraft:
https://www.rt.com/news/443889-mars-nuclear-reusable-russian-rocket/amp/
If only Putin sent a probe to ‘Oumuamua, Russia would become once again the hope of progressives around the world, or something.
https://www.cnbc.com/amp/2018/08/31/huawei-kirin-980-7-nanometer-ai-chip-for-the-mate-20.html
I don’t know shit about the state of Chinese R&D, but based on this, you might not be that knowledgeable either.
i was watching some documentaries about the gulags and the herculean effort to use gulag slave labor to build a 1000 mile railroad across siberia. they started in 1917 or 1918 and kept building until 1953, when they stopped, less than 100 miles from being finished, due to a certain somebody dying.
but when you look at the rare photos, and the extremely rare few videos, of the construction… they were literally building it by HAND. with almost no modern mechanized equipment at all. THAT IS MIND BOGGLING. how on earth did they expect to compete with the west long term under such conditions?
watching them spend over 30 years trying to build a 1000 mile railroad BY HAND makes you realize that IT IS A MIRACLE THEY WERE EVEN IN THE SPACE RACE.
i imagine the current conditions in china are not that much different. better, but still miles behind the west in almost every metric. and it's simply a herculean effort by a capable and determined opponent that only barely keeps them in the game. the chinese at least allow themselves to copy every single thing they can that comes out of the west. the soviets, my goodness man. no trucks or jackhammers while working on a railroad in siberia in the middle of the winter in 1950?! just shovels and wheelbarrows and guys with no winter clothes.
Those computers in China are no doubt all run by Chinese; the ones in the USA probably also have a large number of Chinese running them.
The Soviets also copied lots of things, though eventually they trusted their own designs more (at least in military equipment), as they suited their needs better. But they were willing to learn from the West.
America’s 1,868 mile transcontinental railroad was built by hand.
The Russian Empire and USSR were poor in capital and rich in labor in the time period you describe.
Not necessarily an irrational decision, especially as motorization efforts in the USSR assigned priority to agriculture (free up labor for the cities, reduce feed requirements for draft animals) and the military (defeat Germany).
reiner tor,
“I don’t know shit about the state of Chinese R&D”
you should have just ended your sentence there.
what nm is the Sunway SW26010? you know, the CPU used to build chinese supercomputers? i'll wait while you google.
how many supercomputers can be built out of super weak, super low-powered ARM cell phone microcontrollers? very few. nothing near the top couple systems.
i respect you as a poster, so don’t take this as a complete personal attack. you usually know what you’re talking about.
this is something i deliberately researched when TaihuLight came out.
china is just doing what they always do. build something, because they can. kinda like when they work on cloning humans. build the biggest, nonsensical thing possible, to try to demonstrate technical parity or superiority.
“America’s 1,868 mile transcontinental railroad was built by hand.”
thanks. that makes the soviet effort look, what, 10 times worse? 100 times worse?
just checking google, the transcontinental railroad was built in 6 years and was twice as long at 1900 miles.
in 1869. you know, when there weren’t ANY cars, trucks, jackhammers, backhoes, bulldozers, or anything.
the soviets were still building the railroad by hand in 1950. that’s AFTER world war 2, for people doing the math. i seem to remember cars and tanks and power tools existing at this point. hell, they existed in 1930.
so, 80 years after the US finished a railroad that was twice as long, in only 6 years, without any power tools or even any vehicles, the soviets were unable to even bring trucks and backhoes and bulldozers to an effort half the size.
you must give serious respect to an effort like that. nearly superhuman. also, staying in the space race, with no integrated circuits, and all spacecraft designed by hand calculation and no computer modeling or guidance. respectability is sky high there, if doomed to failure.
Siberian terrain is less well-suited to building railroads than US terrain. The climate is also horrible.
Building of the First Transcontinental Railroad was indeed impressive, especially given that it was bound-up with the Credit Mobilier scandal and railroad corruption.
But half the terrain was across the flat Great Plains, and was close to existing major industrial centers. A lot of the track up until the Rockies was also near riverine transportation.
Building from the Western side was also done simultaneously, something made possible by California’s favorable climate and good ports.
To be sure, crossing the Rockies and the Sierra Nevada was challenging. But not insurmountable.
Yes, this was 1869. But as I noted about 20th century Russia, for the first half at least it was a capital-poor country. You can visit India today and find workers digging out building foundations and swimming pools by hand. It seems awfully anachronistic, but it’s the correct employment of resources when labor is cheap and abundant.
I’m also not familiar with which Russian railroad you’re talking about. The Russian Empire did successfully build the 5,772 mile Trans-Siberian Railway across terrible terrain far from industrial centers and ports in a 25 year period. That’s obviously not the best pace, but consider that Russia was a very poor and backwards country then wracked by instability and various disasters.
Is it this one?
https://en.wikipedia.org/wiki/Salekhard–Igarka_Railway
The article certainly makes it clear that the climate and terrain are horrible.
@ all:
#1: The run-of-the-mill high-performance clusters used around the world nearly all use Intel/AMD/ARM processors with thousands of Nvidia/AMD graphics cards thrown in to max out parallel-computing performance. The comments regarding the quality of Chinese hardware are unwarranted/stupid. It's all standard server-grade hardware anyone could buy.
#2: The manufacturers are: Lenovo (China), Hewlett-Packard (USA), Bull (France), Fujitsu & NEC (Japan). If it's not one of those companies, then it's some mid-size system house or IT service provider like Tata. The only exception is Cray (USA).
#3: The key to utilizing HPC is to have a team of talented programmers who can, let's say, help the National Farmers' Association model next year's harvest under different rainfall scenarios. When you have the model, you build a database (a storage system with a fiber connection to the HPC cluster). Then you run the model against the database, utilizing the crunching power.
-> The key point is that the Top500 list only contains systems whose owner/operator makes the effort to run the benchmarks.
-> With an investment of just 40 million USD, the University of Stuttgart's new HPC system "Hawk" (which is under construction and will be operational in autumn 2019) would place #5 on this year's list.
-> So with a budget of only 10 million USD, even many developing countries could deploy something that qualifies for the Top500 list?
-> Conclusion: Think about it:
The CIA has a budget of 40 billion USD and they can't afford some HPC for brute-forcing some encrypted SSD?
Your nation's air force can't afford an HPC to run some engagement models of the S-400 SAM system?
Your hidden-champion MNC (5 billion USD turnover per year) cannot afford a 25 million USD HPC for its R&D department?
IMHO:
It's more interesting to count the number of 3,000-USD high-end PCs/workstations in every STEM & business school department of your nation's colleges.
In that aspect the USA/Western Europe/Japan wipe the floor with the rest of the world.
prime noticer:
This railroad project (and others like it) may have been a make-work project to get undesirables out of the way. In other words, the Gulag in all but name.
In real-life performance (Rmax), the US total still exceeds China's, meaning that fewer computers do more calculations overall. Rpeak is a theoretical value of little meaning. In general though, supercomputers are overrated – most interesting applications are limited conceptually rather than computationally (that is, 10x-100x more power is not going to be a game-changer).
Saudi has a lot of universities where there are foreign professors, researchers, and students. There is a lot of money put into them as an insurance policy for the time after the oil. They are supposed to also improve the level of the Saudi students and make society more scientific minded. The most famous of these universities is KAUST (King Abdullah University of Science and Technology), which has a lot of money and has basically bought itself a world class faculty. It was run for a while by a famous French chemist and I think the current president is also French. I don’t think they are that effective at making the Saudis better researchers, but I do know several people who studied and worked there. It’s a closed off area near Jeddah into which normal Saudi society doesn’t penetrate so even people who work there for five years don’t learn more than a few words of Arabic. There is still no alcohol allowed on campus however. That is probably the place with the supercomputer, and some other universities are also contenders.
A computational researcher I know at a major university on the main island of Japan is using a cluster in Beijing to do his calculations. He got access through a former Chinese student of his who has now gone back there to do a postdoc. It's better than the Japanese one he has access to from time to time, in terms of allocating more resources for the calculations. So, certainly these developments are welcome… for him.
And yes, this does mean there is no ‘floor-wiping’ of China by Japan in the field of supercomputing. Maybe ‘floor-throwing’, or ‘floor-pinning’ for a few seconds, since my evidence is anecdotal. But no ‘wiping’.
I think that this is an important metric (but difficult to measure). It’s a desire to win, and a refusal to be fazed by current inferiority.
I remember the British consensus of the early 1970’s that the Japanese would never make anything other than joke cars since they were incapable of understanding the finer points of engineering. Ditto with all their other low quality “plasticky” products.
That didn’t turn out too well, and the Chinese are modeling their development on the Japanese experience.
Apple stole Qualcomm chip secrets and gave them to Intel, Qualcomm claims
https://arstechnica.com/tech-policy/2018/09/apple-stole-qualcomm-chip-secrets-and-gave-them-to-intel-qualcomm-claims/
China’s upstart chip companies aim to topple Samsung, Intel and TSMC
https://asia.nikkei.com/Spotlight/Cover-Story/China-s-upstart-chip-companies-aim-to-topple-Samsung-Intel-and-TSMC
In March, Premier Li Keqiang named semiconductors as the top priority of the 10 industries China wants to foster in its "Made in China 2025" initiative. But China's ambitions were already clear in 2014, when it launched the National Integrated Circuit Industry Investment Fund (better known as the Big Fund) with 138 billion yuan ($21.9 billion) in seed capital, which it hoped would turbocharge investment from local governments and the private sector. The Big Fund is in its second phase of fundraising for at least 150 billion yuan. Credit Suisse estimates China's total investment to be around $140 billion.
China wants to end its reliance on foreign technology — its annual imports of $260 billion worth of semiconductor-related products have recently risen above its spending on oil. It also wants to move its manufacturing sector to higher-value products.
Judging China’s future by its present situation isn’t too smart I’d say. Sure, they’re behind in a lot of areas. They know they’re behind and they have plans to catch up. Given their human capital and organisational capabilities, I wouldn’t put it past them – things could look very different 10, 20, 30 years from now. They are obviously able to execute large, complex projects – just look at the high-speed rail network, or the rapid growth of metro rail systems in cities all over China:
https://www.archdaily.com/871713/the-breakneck-evolution-of-chinese-metro-systems
Their GDP per capita (PPP) is still only $16,500, growing at 6-7% annually. In other words, by 2025 they could get to where Russia is now in per capita terms, with ten times the population.
No. So-called supercomputers are merely products of budgets. And the important systems are not publicized.
But what about intellectual capacity? This is more important than hardware purchases. Let’s look:
https://en.wikipedia.org/wiki/ACM_International_Collegiate_Programming_Contest
Russia totally dominates this competition.
Don't know whether that is true. I also hear that China's software development is lacking, i.e. these supercomputers are not being fully utilized, or are running code that is less than optimized for the architecture, so there is huge waste. Don't know whether that's true either.
(As an aside, a big problem with supercomputing applications on highly parallel machines is reliability: your app wants to survive a few nodes dying per hour, which demands particular approaches to save state regularly, but not too aggressively so that you can still get work done.)
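To make that save-state tradeoff concrete, here is a minimal sketch of Young's classic first-order rule for the checkpoint interval; the cost and failure-rate figures below are hypothetical, purely for illustration:

```python
# Checkpoint too often and you waste time writing state; too rarely and a
# node failure costs you too much recomputation. Young's (1974) first-order
# approximation puts the optimum interval near sqrt(2 * C * MTBF).
import math

def optimal_checkpoint_interval(checkpoint_cost_s: float, mtbf_s: float) -> float:
    """Young's approximation: the interval (s) minimizing expected lost time."""
    return math.sqrt(2 * checkpoint_cost_s * mtbf_s)

# Hypothetical numbers: writing a checkpoint takes 60 s, and with thousands
# of nodes the machine as a whole sees one failure every 4 hours on average.
interval = optimal_checkpoint_interval(checkpoint_cost_s=60, mtbf_s=4 * 3600)
print(f"checkpoint roughly every {interval / 60:.0f} minutes")  # ~22 minutes
```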
(And of course, power efficiency, i.e. performance per watt, i.e. operations per joule (and this includes cooling), is THE number to maximize, as you don't want to add a large gas turbine on every upgrade, or even be limited in density by the generated heat; so research in that area, though unglamorous, is very important.)
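A back-of-the-envelope sketch of why that number rules system design: at a fixed facility power budget, efficiency alone sets the ceiling on sustained performance (the figures below are invented for illustration, not real system specs):

```python
# At a fixed power envelope, GFlops/W directly caps achievable performance.
power_budget_w = 15e6            # hypothetical 15 MW facility budget
gflops_per_watt = 5.0            # hypothetical achieved efficiency

ceiling_pflops = power_budget_w * gflops_per_watt / 1e6   # GFlops -> PFlops
print(f"performance ceiling: {ceiling_pflops:.0f} PFlops at 15 MW")  # 75 PFlops
```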
Ok, so. Currently, Chinese supers use CPUs from abroad (Intel Xeons & Xeon Phi, IBM Power9, Nvidia accelerators), but that is changing. From the latest "Communications of the ACM" (November), which has a section on China:
On the Matrix-2000:
https://en.wikichip.org/wiki/nudt/matrix-2000
Or: https://www.nextplatform.com/2015/07/15/inside-chinas-next-generation-dsp-supercomputer-accelerator/
Okay so from the CACM again:
Here is something about that chip:
https://en.wikipedia.org/wiki/Sunway_SW26010
Being liberated from Intel is much like being liberated from John McCain.
And from the “Report on the Sunway TaihuLight” (http://www.netlib.org/utk/people/JackDongarra/PAPERS/sunway-report-2016.pdf)
By the pure science research metric WFC from NatureIndex.com, an offshoot of the top science journal Nature, on 2017 data KAUST was already ahead of IndianaU, BostonU, ArizonaU, StonyBrook, Brown, UCSantaCruz, CarnegieMellon, CaseWestern, Dartmouth, etc. What is more striking is that KAUST was on a positive upward trend, while all but one of the US universities in that cluster were on negative downward slopes. It is projected that when the data for 2018 is finalized, KAUST will be on top in that cluster.
http://i64.tinypic.com/332802c.jpg
Why would Ireland need 12 supercomputers, when Russia and Brazil seem to do fine without them? I suspect that supercomputers in Ireland are not really Irish, but belong to US companies like Google, that have offices in Ireland for tax avoidance purposes. We should probably add them to US total.
nuclear weapons reliability
I highly doubt that supercomputers that work on this and other weapons-related projects are included in this list.
Note that leading edge circuit production facilities and companies are centered in the general area. Taiwan in particular, but also Korea (Samsung) and Japan. (There is also Shenzhen at the other end of the market.) It will be interesting to see how things develop.
China also seems to be pushing for the lead in unimportant software areas like AI, surveillance, social networking and so on. There are no European tech giants to take on Silicon Valley, but there are some Chinese ones. The race is clearly across the Pacific, not the Atlantic.
Finally, Moore's Law is just about over. I think Samsung is currently leading edge here (10nm) while Intel has been struggling (14nm) and GlobalFoundries has conceded, perhaps wisely. The current manufacturing tech and materials appear to have nearly run their course, the limit of which, by the way, used to be put at 8nm.
https://www.irishsupercomputerlist.org/lists/november-2017/
Perhaps it was German and Italian POWs doing the digging as part of their punishment.
There were an awfully huge number of them around in the USSR of 1950.
I'm an optimist on China based on first-hand experience, but when I see tables like this I can't help but think of the tables showing parity in tractor production between the USSR and the US in the 1930s, and how the whole world saw such comparisons as legitimate at the time.
on the contrary, those are exactly the DOE/LLNL/Oak Ridge machines. They are always proudly on the Top500 list. Manufacturers are also keen to say they have built this and that machine.
ARM ain't used in supers, what are you writing?
They may be used in "cloud centers".
Jewgle has a big presence in Ireland to skip paying tax, so I assume many of those are their computers.
Ya. As you asserted, such ancient, inferior integrated circuits beat the pants off the advanced US systems, topping the Top500 list for two years, and the latest second-place Sierra system only just beat it by a slight margin. What was the US doing, hibernating for the last 5 years, not showing anything? That shows the state of pathetic US technology and how much China has caught up. All show and no grunt power.
Hey, your Power9 and the future Power10 are 14nm technology. China has already tested the home-made Dhyana (AMD EPYC) system with 14nm technology at rank 38, and will soon move to EPYC 2 with 7nm technology, while the US seems to be stuck on the same 14-10nm tech for quite some time. Ha ha ha.
OT
Last information on ‘Oumuamua. The new Russian nuclear propulsion probe should go there instead of Mars.
https://gizmodo.com/study-finds-weird-interstellar-object-oumuamua-is-indee-1830456460
Progressives are not interested in intelligent life, here or elsewhere. They would only be appeased by seeing the first transgender Chechen president of the RF.
Well, I didn’t mean politically progressives, I meant just those who hope for the progress of mankind.
Sorry. I couldn’t resist.
Indeed it's a good idea, and as AK keeps saying, people love winners. However, there's a story about 'the Chief Designer' telling Khrushchev that the cost of a moon mission was irrelevant, and Khrushchev replying that he wasn't going to spend unlimited amounts of resources just to beat the Americans to the moon when so many khrushchevki still needed building. Perhaps Khrushchev was mistaken about costs and benefits, and perhaps Putin would be too, but I have no idea how much an 'Oumuamua mission would cost and whether it would be worth it to Russia in its current state of development. Comparative advantage and all that.
On the other hand, perhaps the time has come for Russia to go balls to the wall.
But an undergraduate student at some 3rd-rate college in some Tier-5 city in China has to wait 2 hours until the department PC room's Intel Core 2 Duo system has finished the traffic-flow simulation for his thesis.
His Japanese counterpart utilizes his college's workstation pool. The simulation is done on an AMD Threadripper 16-core system and finishes in 10 minutes.
This is the part where many Western/Japanese institutions/companies are still ahead.
China is currently wheeling out the big guns, but the soldiers are still lacking kevlar vests. For now.
The ‘Oumuamua mission would be worth way more than a mission to the Moon or to Mars. This is the first and (so far, at least) the only object from outside the solar system to come our way. And a pretty weird one at that.
Besides, developing the technology makes it probably worthwhile anyway. Russia could be the leader of space transportation for a long time to come.
All so-called ‘supercomputers’ are vanity white elephant projects. I’d bet that there isn’t a single one in the world that’s actually doing something that needs that computing power.
What makes you think so? There are lots of problems (weather forecasting being one obvious example) which need a lot of computing power.
There is no lack of HPC jobs that still need ‘big iron’ (a nostalgic term). From virtual wind tunnels and FEM to computational chemistry to the classified stuff.
I get the impression these prestige systems usually get booked by the calendar to run something for a few weeks. Also that it’s not unusual to share one between multiple smaller jobs. (At least on the visible academic side.)
A supercomputer is a sophisticated pile of metal and silicon. The software, and how it is used, is the important part.
The ordinary desktop machines you use today would have been supercomputers two decades ago. Yet typical currently available software is slow, crappy, hard to use and often a waste of time.
Two points:
a) Regular commodity computers are now computationally fast enough that building special 'supercomputers' makes no sense. The biggest and baddest 'supercomputers' today live in bitcoin mining centers, and those are designed to waste resources on purpose.
b) We now understand that solving hard problems is made easier by careful application of computational statistics (a.k.a. ‘deep learning’), and the best way to do ‘deep learning’ is by aggressively curating and massaging your input data, not throwing resources at the problem.
Old-school 'supercomputers' are still being used, but they're mostly used for producing the kind of citation-bait, meaningless academic work that's useless for the real world.
These ‘supercomputers’ will not solve the interesting problems of the future.
Soviet hardware engineers often thought Western hardware design to be crappy. So why didn’t they produce better hardware? The issue was, Soviet industry couldn’t produce it. You can have great design talent, without production talent it’s all worthless.
Something similar was with software. Behind the iron curtain software developers often developed great software which managed to utilize low resources (slow and unreliable processors, low working memory, low disk space, etc.) with great efficiency. In the West, it had little value: you could buy a bigger machine, after all.
All this could've been a misallocation of talent: hardware designers and software developers were smarter than in the West, while production was led by dumber people.
Congratulations, you managed to be not even wrong a record-setting number of times there.
The USA was producing 200,000 tractors a year at the end of the Roaring '20s, despite low commodity prices through this period. In the '30s this dropped to 20,000 owing to the Great Depression, the Dust Bowl, and further price declines.
The USSR took first place in 1937 with production of 44,000 units vs. 29,000 for the USA. A nice achievement, but the USA already had a much larger stock of tractors. And American tractors were superior, as they had rubber tires, three-point hitches, etc.
Then there was the fact that America was vastly ahead in fertilizers, pesticides, hybrid seeds, planting equipment, etc.
It was also far easier for American farm produce to get to market given the pervasive motorization of American society and massive “farm road” building campaign in the interwar years. USSR still struggled with this right down to the end of communism, and in the postwar era never developed anything like the American cold-chain storage system, advanced food processing plants, etc.
So yes, foolish to look at just one metric.
Were the production people stupid, or constrained by institutional factors outside of their control?
Consider:
• Civilian enterprises were not authorized to reject defective inputs (insane)
• The planners’ emphasis on gross output indicators meant that there were no incentives for efficiency (had very bizarre effects like keeping machine tools from the 1930s in operation well into the 1970s)
• Assured consumer demand meant no incentive to improve product quality or features
None of these applied to the defense industry. They were permitted to reject defective inputs. There was (outside of WW2) no emphasis on maximum production, rather only to produce the number of military goods specified by the Defense Ministry. As the defense industry didn’t wish to spend more Roubles than required, reducing input costs was a priority. And needless to say the Ministry of Defense had stringent requirements for product quality and features.
I had family involved in the specialty steels industry who did business in the USSR and reported insane things like 3″-thick cast iron gutters. More production, comrades!
With respect to the computer industry in general, the control-freak nature of communist parties meant that they didn’t want most people getting access to computers or even photocopiers.
Of course I suppose given such a structure of incentives, probably it didn’t attract talented people to begin with. Vicious cycle.
The Eastern Bloc cloned and produced twenty-year-old IBM machines. I remember a Czech company building wardrobe-sized magnetic tape memory machines in the 1980s, when hard disks were already widely used in the West. As a young electronics hobbyist, I was surprised by the archaic style of their circuit boards.
Generally, the [Czechoslovak] electronics industry lagged far behind (except perhaps in defense). It was perceived as merely a consumer industry, so it got low priority. Due to the embargo and the lack of hard currency for imports, the industry didn't face outside competition, so morally and technologically obsolete goods were produced. Communist decision makers couldn't fathom why ordinary people would want a personal microcomputer, so they completely ignored this trend. (There were a few homemade designs, built in minuscule numbers.)
I personally do not believe in the mythical ability of programmers behind the iron curtain to overcome hardware problems by their ingenuity. In Czechoslovakia, there were only a few university departments where programming was taught, and they lacked proper machines. They used ancient non-interactive batch-processing behemoths, where you submitted a stack of punched cards and received the results the next day. With no immediate feedback from the computer, it is hard to become a great coder.
The USSR was a pioneer of the RISC architecture in microprocessors. Boris Babayan and his team pulled off many impressive feats, but unfortunately for them the USSR began to collapse just as they had more or less caught up with the US in terms of processor performance. Well, not completely caught up, but close enough. They managed to pull off things like the fully auto-piloted Buran space shuttle launch by the end of the 1980s.
About the Czech supercomputer.
In the table, the Czech Republic has one entry, in last place. It is a machine financed by EU funds "for science", built in 2015.
Last year, the Czech Minister of Finance (and director of Microsoft Czech Republic during the 1990s) said in a parliamentary speech about R&D investments:
Shouting ‘reees’ really hard doesn’t make something true.
Fact is that no supercomputer will ever be used to solve an important problem of the future.
They’re interesting pieces of hardware, but the world moved on. The important problems can’t be solved by brute force.
Rpeak is the theoretical maximum throughput you could get, given the hardware.
Rmax is the maximum measured performance obtained when running the benchmark harness of LINPACK, a linear algebra library that exploits the low-level features of the hardware.
http://www.netlib.org/linpack/
This set of tests is not only relevant, but also very important, because most important simulation problems amount, after a series of discretization steps, to solving a gigantic linear system of the form LHS * x = RHS, where the right-hand side RHS is known, and so is the matrix LHS, while x represents the set of unknowns — typically, the values of the continuous function at a number of nodal points.
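For readers outside the field, a toy single-node version of that operation might look like the numpy sketch below. This is not the actual HPL benchmark, just the same kind of dense solve it times, with the standard leading-order (2/3)n^3 operation count used to convert elapsed time into a FLOP rate:

```python
# Toy version of the work LINPACK measures: solve a dense system LHS * x = RHS.
# HPL does this at enormous scale across distributed memory; this is one node.
import time
import numpy as np

n = 4000
rng = np.random.default_rng(0)
LHS = rng.standard_normal((n, n))
RHS = rng.standard_normal(n)

t0 = time.perf_counter()
x = np.linalg.solve(LHS, RHS)            # LU factorization + triangular solves
elapsed = time.perf_counter() - t0

flops = (2 / 3) * n**3                   # leading-order LU operation count
print(f"~{flops / elapsed / 1e9:.1f} GFlops sustained on this solve")
print(f"residual norm: {np.linalg.norm(LHS @ x - RHS):.2e}")
```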
This is the crux. I was about to write about the issue but found your comment already written… In any event, for all who do not work in the field, that's the key point to remember. The HW side of things is just one aspect, and not necessarily the key one. If you have mediocre programmers (like the Chinese have, despite the many outliers that come simply from their population size), then that's that. This is also why relatively small countries like Germany or France (and Switzerland) are doing extremely well in relative terms. And of course, the US is indeed totally "wiping the floor" as a result of a combination of multiple factors, including accrued momentum in the field (such as the programming languages themselves using English statements).
Japan once had the #1 machine in the TOP500, over a decade ago. It was called the Earth Simulator. It was supposed to demonstrate Japanese superiority in the field. And what happened next?
I agree with you here for the most part, except at least for the very important case of this field of engineering where testing the full assembly is no longer permitted due to international treaties. If you look at where the main US and French supercomputer are, and what they are used for, you will see this immediately. I suspect this is also what the Chinese are really interested in.
Regarding the other applications, I am, like you, very dubious. Small garbage inputs into a supercomputer yield massive garbage, and the longer you run the CompSimp, the bigger the pile of garbage. Think weather forecasting.
Weather forecasting is certainly very good now, compared to what it was in the 80s. It’s usually good for the next few days, but gets progressively more and more unreliable after that.
Japan’s history in computing is somewhat interesting.
Japanese companies completely overtook American ones in memory chips in the 1980s, which is what famously prompted Intel's legendary Andy Grove to drop that product line and focus exclusively on microprocessors. For a while, it looked like Japanese firms would overtake American ones in even this. NEC reverse engineered Intel's CPUs and manufactured them more reliably and to run at higher speeds.
See here for instance: https://en.wikipedia.org/wiki/NEC_V20
For once, the US government responded aggressively to a foreign technological challenge, probably because semiconductors were vital to national defense and widely understood to be an industry of the future.
The US government and industry organized a consortium to pool development resources, matched with government funds. It was known as SEMATECH. This, combined with the simultaneous devaluation of the US Dollar (1986 Plaza Accord), worked.
At the same time the Japanese shot themselves in the foot by attempting a Project Apollo style moonshot approach to completely leapfrog to the next generation of computing technology and completely dominate the world industry. See here: https://en.wikipedia.org/wiki/Fifth_generation_computer
In focusing all their resources on leapfrogging, they ended up being bypassed by the rapid evolutionary process of x86 architecture (something similar happened to alternative American efforts like PowerPC and Itanium). It’s also worth remembering that the 1980s featured artificial intelligence and robotics fads just like today, which the Japanese felt required these computers.
Today Japan still makes memory chips, and it has a highly profitable niche making embedded and automotive semiconductors. But it long ago lost relevance as a global computing power of note. And in software Japan is even less relevant. Reportedly Japanese programmers are twice as productive as American ones, but Japanese software companies primarily develop bespoke software for Japanese corporations and thus have no products to market on a large scale.
That said, Japan remains vitally important in the global computing industry supply chain because all semiconductor-grade silicon is produced by just two Japanese firms. Once America’s Monsanto and Germany’s Wacker Chemie competed, but they exited the market long ago. Much of the equipment required to manufacture semiconductors is also produced in Japan, though in this they lack a monopoly.
A couple of years ago I saw a website that compared predictions to actual outcomes over the Western Alps region. It was dismal.
Certainly in some areas it must be better; but the instability of oceanic climates, combined with the extreme sensitivity to initial conditions (which are impossible to discretize reliably in areas with sharp ridges like the Alps), is a limit that cannot be overcome by brute-force computational power.
For a simple illustration, run the baker's map for a couple dozen iterations, and you will see that it has reached the limits of the FP precision on your system. Double the FP precision by using quads instead of doubles, and all you will gain is about 50% more steps before divergence, not 100% more. The nature of the problem, not insufficient hardware, is the limiting factor.
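A minimal sketch of that experiment, using the doubling map (the expanding coordinate of the baker's map): two initial conditions differing by one part in 10^15 reach order-one separation within about fifty iterations in double precision, and extra precision only delays the divergence, never prevents it.

```python
# One iteration of the doubling map x -> 2x mod 1, the chaotic, expanding
# coordinate of the baker's map: any initial error doubles at every step.
def doubling_map(x: float) -> float:
    x = 2.0 * x
    return x - 1.0 if x >= 1.0 else x

a, b = 0.333333, 0.333333 + 1e-15       # nearly identical initial conditions
for step in range(1, 80):
    a, b = doubling_map(a), doubling_map(b)
    if abs(a - b) > 0.1:
        print(f"order-one separation after {step} steps in double precision")
        break
```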
I did not realize that those 2 Japanese firms had, combined, such a monopoly. Thanks for pointing this out.
The 5th Generation Computer project I remember very well, as it came out exactly the same year I got my first computer (a ZX81)… Well, it turns out that, if you look at where programmers of my generation learned their skills, the legacy of the ZX81 far exceeds that of the 5th Gen Computer… Another piece of nostalgia for those who grew up in the 80s is "Japan as Number One"; but you, Thorfinnsson, are certainly too young to remember that BS 😉
You mentioned Japanese programmers being twice as productive as American ones. Please allow me to be dubious here. I have met, worked with, worked under, and finally led dozens of programmers from all over the world in the past 25+ years. I think this is where, albeit superior in terms of IQ, the intelligence of Northeast Asians is lacking something that we hap-R1b people have. Based on my own ranking of programming kick-ass-ness, with n>>1, Ashkenazim wipe the floor, followed closely by R1bs. And when I started making those observations, a couple of decades ago, I did not even know what a haplogroup was; I was just classifying such people as "Western Europeans".
Most people don’t live in mountainous regions.
Yes, so you need to quadruple the brute force to solve the problem. The issue is that, as far as I know, there's no real workaround for this problem, so either you put in the brute force (which is getting cheaper each year) or you must be willing to limit the accuracy and time range of your forecasts.
So this is definitely a problem where supercomputers are needed, at least as long as you need the forecast. (Which you do, if you have agriculture or a devastating hurricane or something.) We'll just throw more and more hardware at the problem, with ever diminishing returns, but with the ever diminishing price of hardware, it won't be such a big problem.
I’m just old enough to remember the tail end of American anger in the early ’90s along with some Japonophobic Hollywood films like Showdown in Little Tokyo and Rising Sun from cable TV. My knowledge about business, technology, management, etc. from the era is from reading.
Here’s a good American country song from the period:
The famous American "white advocate" Jared Taylor grew up in Japan and was originally a computer journalist. Here is an article of his from 1985 about the strong likelihood that Japanese printer companies would destroy their American competitors: https://books.google.com/books?id=NFT964jBihIC&pg=PA7&dq=jared+taylor+printer&hl=en&sa=X&ved=0ahUKEwiYorP_ldfeAhXxqlkKHRYCA98Q6AEIKDAA#v=onepage&q&f=false (page 200 if the link doesn't take you right to the article)
I have no idea whether or not Japanese programmers are that productive, but the journal articles I saw about this suggested it had to do with managerial techniques. As far as individual skills go it seems like Eastern Europeans tend to win various international programming competitions. Slavic supremacy??
Beyond 3nm processors, do you foresee a post-silicon era in microprocessors, say processors with graphene transistors, or something totally different like nanomechanical computers, in the late-2020s to early-2030s timeframe?
One big issue with weather forecasts is that they also depend on the accuracy of the initial data. If your initial data is inaccurate, you will get increasingly inaccurate forecasts, too.
So throwing supercomputers at the problem is not enough; you also need to enhance the accuracy of the measured data. Fortunately, this is also improving.
In short, you get increasingly accurate input data, and at the same time exponentially increasing brute computing capacity, which makes it feasible to extend the accuracy of your forecasts ever further into the future.
I agree this is not an extremely important issue for which supercomputers are needed, but it is definitely a useful enough issue.
Fact, sure. If you define it so there are no important problems in engineering or the sciences. Keep digging.
Weather prediction has definitely gotten better where I am. It used to be that the next-day or next-few-days weather report was worthless, but now you can basically trust it.
Regarding precision, your numerical analyst is supposed to steer you away from ill-conditioned problems and such.
There are at this time no credible replacements for silicon, and personally I think it will take quite a while to spin up the routine production and use of such a replacement material should one be found. Very likely more than a decade, probably much longer. Recall that CMOS has been pushed quite far at this point and all parts of the production chain are highly sophisticated. Starting over with something exotic will require a huge effort.
Also, all their little universities will have their own clusters now, as a result of domestic government priority on education and vast EU funding.
In this area, the EU has designated Ireland as a center of excellence, so Ireland can also access special funding.
First the multinational corporations arrived, with massive recruitment of foreign workers.
Subsequently, Ireland (with EU assistance) invests in education, to increase skill level of local workers, so they can be recruited too.
Informatics and pharmaceuticals are for Ireland what oil and gas are for Russia.
So what about Irish young people today? A cute 23-year-old Irish girl can be recruited as a UI/UX designer for a corporation with multinational dimensions, for around $50,000. Qualifications? An undergraduate degree involving post-colonialism and a master of science in "digital media". In Russia, imagine, she would face a difficult recruitment process to work as a secretary.
People concentrated more on writing bug-free programs. People were also more concerned with writing efficient code.
Computer programming (nobody called it a science) was taught in math departments. At some point these groups began to emancipate themselves into computer science departments. The trajectory in the Eastern Bloc was no different from the West, except that it lagged behind.
Mainframes were built in Russia and Poland, based on the IBM 360 and ICL 1905 respectively. Peripheral devices were built by all the other countries; iirc, Bulgaria was doing printers, Czechoslovakia storage, and so on. There were teletype terminals, and then monitor terminals were introduced. When Reagan began to impose an embargo on hi-tech in the early 1980s, it slowed them down. Early microprocessor technology was developed in the DDR and even Bulgaria, where they were making Z80 and 8080 clones. Some people in the US were prosecuted for exporting PCs to the Soviet Bloc.
Success is Russian, not “Eastern European”.
Think about one of the world's largest populations of intelligent, nerdy guys, with middle-class parental aspirations. Combine that with "not ideal" career progression. The result was that competitive instincts emerged around these contests for some kids, as with certain girls and figure skating or ballet.
Japan also has a self-described "imaginary engineer": the man of elite culture whose work is guided by Buddhist texts.
Chapter 16 of this book:
http://download1.libgen.io/ads.php?md5=2B7F945B67ACC3EEA143C1B258DDB770
I'd heard that. People using only pencil are forced to think, and thus make great software. I must have missed this miracle.
The problem was not just the shortage of hardware, but also the lack of textbooks and other learning materials. There were a few lousy books, translated by someone who had no clue what it was all about. You really needed to experiment to figure things out, and there was nothing to experiment on.
Back then, typical programmers on those locally built mainframes were often unqualified people with some training. It was not a prestigious job or a fast money maker, or even something exciting. They wrote simple data processing utilities and tools for accounting.
As an example of the hardware shortages: a Czechoslovak airplane manufacturer, a huge company producing ~60% of jet trainers worldwide, managed to smuggle in an embargoed VAX machine during the 1980s, already obsolete and thrown away as scrap. For the manufacturer it was a high-tech dream.
World Finals:
https://icpc.baylor.edu/worldfinals/results
Russia in 1st (MGU), 2nd (Phystec), 9th (St. Petersburg ITMO) and 13th place (Urals). Lithuania 12th place (Vilnius), Ukraine tied in 14th place (Lviv) with Russia (St. Petersburg State), Poland (Warsaw).
So in the top 13 this year there was one Lithuanian university.
And then, among the 17 teams with equal scores after the top 13, filling out the top 30, there is one Ukrainian and one Polish university; and how many Russian universities?
This is a Russian victory (in a cool sport, although not that useful a one: in real life it's not a race against the clock, problems are rarely so clearly defined, and the slowest people are often better at the job).
I would advise youth in computer science to spend the same time studying serious maths, but they're always more likely to prefer anything in a competitive format.
Don't forget that Japanese SoftBank now owns ARM Holdings.
This topic was an important debate in the 1970s:
Knuth (1974)
https://www.maa.org/sites/default/files/pdf/upload_library/22/Ford/DonaldKnuth.pdf
De Millo (1979), taking a significantly different view:
https://www.cs.umd.edu/~gasarch/BLOGPAPERS/social.pdf
I am reading your comments and they feel strange. Are you talking from your own experience? Were you in Czechoslovakia in the 1970s? You do not seem to know what you are talking about. Where does the tone of resentment come from? Like this one about the textbooks?
The people who were doing programming were usually from math, engineering and economics departments. Programming was chiefly used to solve mathematical problems. There were also economists in central planning offices running all kinds of models on the data they collected. They used mostly Algol and Cobol, while scientific programming was done mostly in Fortran. What do you need textbooks for at this level? Programming was simple then; you could figure things out by yourself.
At that time the Soviet Bloc probably had more math and engineering students per capita than the US, and after transitioning to programming they could solve a lot of problems by themselves. Modern computer programmers are often undereducated in mathematics. Yes, they are good at programming in an environment that has become much more complex than in the 1970s or 1980s. They are taught good programming habits and optimal structure. And most importantly, they have learned the skill of copy-pasting subroutines they do not understand. In the past, many programmers wrote those subroutines themselves, and were capable of doing it. Nowadays these subroutines are just black boxes that programmers have no clue about. This is understandable, because the environment is more complex. Believe me, in the 1970s and 1980s the human material that went into programming was of much higher quality than now. But the environment has changed. It also became more democratic and accessible to the masses. Programmers began to develop programming tools to let people who are not really programmers do what programmers would have done in the past.
The Soviet Bloc was just lagging a few years behind, but similar processes were taking place there as in the West.
A financial investment rather than an industrial or technological investment.
I’m just waiting for the first video game system to come out that incorporates finite element analysis into the graphics engine, thus allowing for completely realistic destructible environments. That would be cool.
AlphaZero is not a jump in power but rather a leap in efficiency and generalisable learning. AlphaZero used only four of Google's TPU1 chips and can play chess at a superhuman level as well as Go, whereas its predecessor used 48 TPUs and could only play Go.
ht
Closer and closer to something that might run detailed simulations of the future and work out an unbeatable strategy on its own account. Hassabis suggests the main safety measure will be an agreement that whichever AI research team begins to make strides toward an artificial general intelligence halts its project until the control problem is completely solved before proceeding. Bostrom's book points out that any such verge-of-success slowdown by a lead project might well motivate a lagging country to mount a safety-last catch-up crash program, or even the physical destruction of a rival country's project.
Good links. Thanks!
My father tried, for a short time, to work as a programmer in a computing center in the 1970s. He didn't have the proper qualifications (he studied agriculture), he didn't like it, and he left. A relative of mine worked as a programmer in the 1980s, developing internal accounting tools for a large company. She used Cobol, Algol and Pascal. No math/engineering background, not even a university degree. The work was boring as hell.
It is surely possible there were well qualified people developing exciting software, but I do not know about it. Military or top notch science could be different.
Back then, during the 1980s, I got interested in computers. I couldn't find literature, and it was next to impossible to buy a micro. Other people may have been luckier or had contacts abroad.
This may have been true, but it had little impact on spreading software development skills. Machines for personal use were not available (trust me, demand was high, supply almost nonexistent), there were no computer journals, no BBS system, and the existing computer-related literature was of miserable quality and printed in low numbers. No computer users meant no hobbyists writing software and trying to go professional.
Czechoslovakia completely missed the micro-computer revolution. From what I heard about 1980s Yugoslavia, it was paradise in comparison.
Yes, there have been people who were really interested and were lucky to get their hand on the real thing. I knew some, and they were really dedicated and capable. However, this happened in spite of the system, which was defunct in this regard.
With no immediate feedback from the computer, it is hard to become a great coder.
Sorry, but as somebody who started in the card punch era and whose first job was using a Burroughs computer with its CANDE (Command and Edit) interactive system, it is the other way around. When you have to make sure you don't have any syntax or logic errors, you make more of an effort to get your code right the first time. You spend a lot of time running through the logic and running test samples through your mind. The kids I see now just throw something together and keep re-running it, making changes as errors pop up.
I am not so sure. I was at UCLA in the early 80s when Patterson gave a lecture to our computer architecture class.
https://www.cnet.com/news/risc-chip-inventors-hennessy-patterson-win-computing-turing-prize/
I believe there was more than one RISC design back then.
How much of that is due to better information coming in (better satellite images, Doppler radar, and so on), and not necessarily to added computing power?
Exactly.
This LH guy seems to be exceptionally angry with the deprivation he or his dad had to suffer in communist Czechoslovakia.
This is one of the coolest threads that I’ve read in years.
It’s all bought with money and does not represent Saudi society creating this growth. Entire research groups are just Chinese there. Also plenty of Americans and European researchers. This just proves that researchers will go where the money is and they don’t really care that much about ‘morals’, despite the fact that the academy is traditionally thought of as having left-wing (i.e. moral) values.
Well, there is intense competition for research positions in the West, and very capable people who 20 years ago would have been sure to get a tenure-track job now don't get one. Then you have to face the fact that you can't do the research you love and must go into industry, or else take advantage of these new types of universities with lots of money that try to raise the international profile of their country, even a questionable country. If the choice is between the end of your career and its continuation at the price of closing your eyes to some questionable politics, the vast majority choose the latter.
Of course, there are plenty of researchers who do get a tenure-track position in the West under the current circumstances who, given the intense competition, simply cheated. You just have to witness the dumpster fire that is the CNRS, or French biochemical science in general, with their myriad scandals and photoshopped Western blots showing up all over PubPeer, from the biggest names in the field. So maybe, if you're a scientist who thinks your research is important, it is more moral to give yourself a chance in Saudi Arabia than to lie your way into a position in France, give people false hope with cancer cures based on Photoshop, and cheat legitimate scientists out of funding and careers.
You need both. With bad quality data you don’t even need to bother with computing power, because the results won’t be any good anyway.
Programs developed using punched cards, paper or film tapes were tiny. What was sufficient back then wouldn't work now, for better or worse. Nostalgic flashbacks are nice, but not very relevant for today's development.
Unlike the very smart commenters here, I am not able to judge coding skills across cultures, but I can compare how it was a few decades ago versus now. The systemic problems of the olden days are over, fortunately. This doesn't mean everyone will automatically turn into a programming god now, but you no longer have to deal with so many external obstacles when entering the field.
I also disagree with the widespread myth that the emphasis on math education in former communist states made their people distinctly better programmers. More was needed, and it was missing.
Oumuamua: “The name comes from Hawaiian oumuamua, meaning ‘scout'”
The name chosen for an object detected by a high-tech, computer-controlled astronomical device comes from the language of a primitive stone-age culture, chosen only because they are non-white people of color.
Picture of the White male from Canada who discovered the object:
Robert Weryk
Next interstellar asteroid to be discovered is planned to be named “Mumbojumbo”.
It’s a Cookbook!!!
Supercomputer count is very much an early-2000s measure, similar to how the LHC is regarded in physics. It's an outmoded approach and will be superseded by smarter application of new technologies.
A better but more difficult metric would be one involving the sophistication of integration of storage, FPGAs, GPUs and other specialized hardware.
You can wire together as many cores as you like and still get trounced by application-specific hardware and software-defined networking.
Curious.
Do you mean that the levels of abstraction today help with actual software development?
Keyword “abstraction”.
I know for a fact that a lot of "programmers" today have no idea how computers actually work.
Take the era of PDPs, for example. Programs then were, true, a joke compared to programs today. But a programmer did know how the thing actually worked, from the gate level up to the processor interacting with peripherals.
Today, with additional levels of complexity ("cloud") and "computer science" courses treating the CPU as a "black box", I am not quite sure today's programmers are that good in general.
What's the saying, that today's programmers program IDEs… not actual hardware.
At the end of the day, it IS digital electronics really down there. A chip datasheet, for all practical purposes. And "programmers" today see the compiler as a "black box" and don't know anything about assemblers. The datasheet, except for embedded guys, doesn't even exist.
Of course I am talking about average types here. There are always those savants. Never met one, though.
Russia develops mobile super-computer for defense industry
http://tass.com/science/1032180
By systemic problems I mean things like the shortage of machines (in Czechoslovakia) and information being expensive and hard to find (everywhere, before the age of the internet).
I mostly agree with you about the sorry state of current software development.