Super Computing News - Page 3

The latest and most important Super Computing news - Page 3.

Supercomputer built 8 million simulated universes in 3 weeks

Jak Connor | Mon, Aug 12 2019 7:17 AM CDT

Scientists are always looking for newer and better ways to understand the universe we live in, and one of the best tools at their disposal is simulation.

According to a new announcement from researchers at the University of Arizona, the Ocelote supercomputer has generated 8 million simulated universes for scientists to study. These simulated universes will be directly compared to our actual cosmos, and through that comparison scientists hope to draw better conclusions about the cosmic events that occurred, while also filling in missing data points that currently puzzle theorists.

While 8 million simulated universes in just three weeks is certainly an achievement in itself, Ocelote didn't quite have the power to render these universes in every detail, as that would require an astronomical amount of computing power. Instead, the scientists devised a system where the computer produced results covering a "sizeable chunk" of the observable universe. It should also be noted that each of the universes was created under a completely different set of rules, meaning scientists now have a lot of busy comparison work ahead of them.

Continue reading: Supercomputer built 8 million simulated universes in 3 weeks (full post)

AMD powers world's largest, most expensive supercomputer

Anthony Garreffa | Wed, May 8 2019 8:24 PM CDT

AMD has just scored a gigantic deal: the company will power the next-gen fastest and most expensive supercomputer in the world, with the US Department of Energy ordering a new AMD-powered custom supercomputer built by Cray, called Frontier.

Frontier will be powered by super-fast EPYC processors and Radeon Instinct accelerators that will pump out an astonishing, record-breaking 1.5 exaflops of processing power. The system will be used for tasks including advanced calculations in nuclear and climate research, and simulations of quantum computers, nuclear reactors, and more.

The new system will be delivered in late 2021 and will be turned on and cranking along in 2022 at the Oak Ridge National Laboratory in Tennessee. AMD has some huge bragging rights here, as Frontier has as much processing power as the 160 fastest supercomputers today combined -- yeah, combined.

Continue reading: AMD powers world's largest, most expensive supercomputer (full post)

Intel develops new tools to speed up quantum computer tech

Anthony Garreffa | Thu, Feb 28 2019 11:34 PM CST

Intel wants to boost the development of quantum computing technology, with the chipmaker unveiling its new Cryogenic Wafer Prober, which allows researchers to test qubits on 300mm silicon wafers at super-low temperatures. Intel says this is the first quantum computing testing tool ever made, making it a very big deal.

Intel partnered with Bluefors and Afore on the new cryoprober. The tool came about during the development of Intel's own quantum computer, when the company worked out it needed a cryoprober to make it easier to test qubits in silicon before they're finalized, put into quantum chips, and sent off to customers. The company added that the cryoprober will allow it to scale up manufacturing of silicon quantum computers with fewer issues.

Quantum computers and their chips are normally tested for months at a time in a super-low-temperature dilution refrigerator to work out what works and what doesn't. Normal transistors can be tested within an hour, versus months for quantum chips. Feedback from that testing can be turned into tweaks and sent to manufacturing before the chips are made.

Continue reading: Intel develops new tools to speed up quantum computer tech (full post)

NVIDIA powers world's fastest supercomputer: 200 petaflops

Anthony Garreffa | Mon, Jun 11 2018 8:29 PM CDT

The US is once again home to the world's fastest supercomputer, with Summit making its debut at the Oak Ridge National Laboratory in Oak Ridge, Tennessee. Summit is powered by NVIDIA technology, which is how it has climbed to the top of the supercomputer business.

Inside Summit you'll find an insane 27,648 of NVIDIA's super-fast Volta Tensor Core GPUs, which are capable of 200 petaflops of computing power. Considering that the previous supercomputer champion, China's Sunway TaihuLight, pushes only 93 petaflops, Summit has truly climbed to new supercomputing heights.

On top of the 27,648 Volta Tensor Core GPUs, there are also 9,216 CPUs crammed into 5,600 square feet of cabinet space, about the size of two tennis courts. The combined system weighs approximately as much as a commercial jet, and considering the 200 petaflops of power, this is an amazing technical achievement. Summit is also capable of 3 exaops of AI performance: if every single human being on Earth did 1 calculation per second, it would take roughly 15 years to do what Summit can do in a single second.
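
As a quick back-of-envelope sketch, the "15 years" claim can be sanity-checked. The population figure below is an assumption on my part (the article doesn't state what number the claim is based on), and the exact result shifts depending on what you plug in:

```python
# Back-of-envelope check of Summit's 3 exaops AI claim.
summit_ops_per_sec = 3e18          # 3 exaops = 3 x 10^18 operations/sec
world_population = 7.6e9           # assumed 2018 population, 1 calculation/sec each
seconds_per_year = 365.25 * 24 * 3600

# How many years would all of humanity need to match one second of Summit?
years = summit_ops_per_sec / world_population / seconds_per_year
print(f"{years:.1f} years")        # roughly 12.5 years with this population figure
```

With a smaller population assumption the figure lands closer to the quoted 15 years; either way, the order of magnitude checks out.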

Continue reading: NVIDIA powers world's fastest supercomputer: 200 petaflops (full post)

Google's new TPU 3.0 revealed, REQUIRES liquid cooling

Anthony Garreffa | Thu, May 10 2018 12:11 AM CDT

Google has just blown the industry away with TPU 3.0, its next-gen custom-designed processor built to train machine learning systems at ridiculous speed.

TPU 3.0 is 8x faster than its predecessor, and considering the first TPU was only released in 2015, the company has come along in leaps and bounds. A pod of TPU 2.0s packed ASICs featuring 64GB of HBM with 2.4TB/sec of memory bandwidth, which is pretty insane. In comparison, the Radeon RX Vega 64 with 8GB of HBM2 is capable of 512GB/sec.
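
For scale, the two bandwidth figures quoted above work out to nearly a 5x gap:

```python
# Memory bandwidth comparison using the figures quoted in the story.
tpu2_pod_bandwidth_gbps = 2400     # 2.4 TB/s of HBM bandwidth in a TPU 2.0 pod
vega64_bandwidth_gbps = 512        # Radeon RX Vega 64 with 8GB of HBM2

ratio = tpu2_pod_bandwidth_gbps / vega64_bandwidth_gbps
print(f"{ratio:.1f}x")             # prints "4.7x"
```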

Google should be the new AI chip champion with TPU 3.0 ready for TensorFlow use, along with a refined push into the cloud. The new TPU 3.0 chips are so next-level that they require liquid cooling to keep them in check, while providing a huge 100 PFLOPs of machine learning power... crazy stuff.

Google didn't provide full hardware specifications for TPU 3.0 apart from it being 8x faster than TPU 2.0, so we'll have to wait a little while longer to see just what makes it that much faster than its predecessor. I'm sure Google is using a new process node, HBM2, and much more to reach these lofty heights.

Continue reading: Google's new TPU 3.0 revealed, REQUIRES liquid cooling (full post)

Intel Nervana Neural Network Processor: 32GB HBM2 at 1TB/sec

Anthony Garreffa | Thu, Dec 7 2017 9:53 PM CST

Intel is hard at work on the research and development side of its upcoming Nervana Neural Network Processor, a new chip that will blow away any general-purpose processor for machine learning and AI applications.

Carey Kloss, Vice President of Hardware for Intel's Artificial Intelligence Products Group, has provided an update on the work Intel has done on the NNP.

What does a neural network processor (NNP) have to do? Training a machine using neural networks requires a gigantic amount of memory and arithmetic operations to generate useful output. Beyond that, scaling capability, power consumption, and maximum utilization are the cornerstones of Intel's Nervana design.

Continue reading: Intel Nervana Neural Network Processor: 32GB HBM2 at 1TB/sec (full post)

Amazon goes 1984, uses cloud AI to translate/track people

Anthony Garreffa | Thu, Nov 30 2017 12:55 AM CST

During the recent Amazon Web Services re:Invent conference in Las Vegas on Wednesday, AWS boss Andy Jassy announced that the company will be enabling a suite of AI-powered tools. Jassy told the audience of over 40,000 people: "We have to solve the problem of making [AI] accessible for everyday developers and scientists".

These new cloud-based AI tools will be capable of measuring sentiment, tracking people in live video feeds, translating languages, and much more. The list of new AI-enabled services AWS announced is scary good; check them out:

  • The new Amazon Rekognition Video tool is able to recognize and track people in real-time video feeds, giving it certain advantages over video recognition tools from cloud rivals Google and Microsoft.
  • The Amazon Transcribe system can transcribe audio recordings of people speaking into clean text files.
  • The Amazon Comprehend service can pick up on positive or negative sentiment and certain people, places and phrases in text.
  • AWS also unveiled Amazon Translate, a service for translating text from one language into another, which Google has provided to developers for years. CNBC first reported that AWS was working on a translation tool in June.

Continue reading: Amazon goes 1984, uses cloud AI to translate/track people (full post)

Google is building its new AI research lab in Canada

Anthony Garreffa | Thu, Jul 6 2017 1:15 AM CDT

I thought that Skynet would open up in the US, but it looks like the Terminators will want some bacon and maple syrup instead, with Google announcing its new DeepMind AI research lab is open for... well... business I guess, in Canada.

DeepMind has announced that its new AI research lab is opening in Edmonton, Alberta later this month, with three University of Alberta computer science professors (Richard Sutton, Michael Bowling and Patrick Pilarski) leading the group. They will be joined by seven other AI leaders, too. The big question is: why isn't Google's new AI lab opening on US soil? Recode reports that there are familiarity and political considerations: over a dozen University of Alberta grads work at DeepMind, and Sutton was one of the first to join the AI lab as an advisor.

The Canadian government is also more willing to invest in AI research, cozying up to AI scientists to the tune of $125 million in funding, on top of existing commitments. On US soil, the Trump administration is moving away from scientific research, proposing major funding cuts.

Continue reading: Google is building its new AI research lab in Canada (full post)

Big Pharma is tapping AI for drug delivery process

Anthony Garreffa | Mon, Jul 3 2017 11:35 PM CDT

It appears that Skynet wants us all on Big Pharma drugs, with British pharmaceutical giant GlaxoSmithKline (GSK) looking to AI to design better, more efficient - and, I'm sure, more profitable - drugs.

GSK announced a new partnership with Exscientia, a British company that specializes in drug design. The two will work together using Exscientia's AI-enabled platform to discover novel, drug candidate-quality molecules. GSK has tasked Exscientia with working on 10 specific disease-related targets, and if those targets are hit, GSK will write a cheque for $43 million in research payments.

The partnership will see the companies tap the power of supercomputers and machine learning to predict how new compounds will behave, and speeding this process up with crazy amounts of AI-aided computing power will save the company both time and money. Human researchers are nowhere near as efficient as AI and supercomputers working every second of every day on a billion things at once, which could mark a very big change for medicine.

Continue reading: Big Pharma is tapping AI for drug delivery process (full post)

US failing supercomputer ranks, falls to 1996 levels

Anthony Garreffa | Tue, Jun 20 2017 12:30 AM CDT

You'd think that the US would lead the supercomputer race, but it's China that is dominating right now. According to the latest TOP500 ranking of the world's most powerful supercomputers, the US has fallen so far behind it's at 1996 levels.

The list is updated twice a year and ranks supercomputers by overall computing power. China holds the top two spots, with its Sunway TaihuLight pushing an incredible 93 petaflops and its Tianhe-2 capable of 33.9 petaflops. The US's best entry, the Department of Energy's Titan supercomputer, manages just 17.6 petaflops in comparison.

The newly upgraded system at the Swiss National Supercomputing Centre also shows some power on the new list, pushing out 19.6 petaflops, up from its previous 9.8 petaflops and beating the US's Titan. The US is no longer in the top three supercomputer rankings, falling to fourth place for the first time in over 20 years.
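
Using just the petaflops figures quoted in this story, a quick sort reproduces the ranking (the short machine labels are mine):

```python
# Rank the systems mentioned above by their quoted petaflops figures.
systems = {
    "Sunway TaihuLight (China)": 93.0,
    "Tianhe-2 (China)": 33.9,
    "Swiss National Supercomputing Centre": 19.6,
    "Titan (US)": 17.6,
}

# Sort descending by petaflops and print the resulting rank order.
ranked = sorted(systems.items(), key=lambda kv: kv[1], reverse=True)
for rank, (name, pflops) in enumerate(ranked, start=1):
    print(f"{rank}. {name}: {pflops} petaflops")
# Titan lands fourth, behind the upgraded Swiss system.
```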

But don't worry: the US Department of Energy is building an all-new IBM machine dubbed Summit, which will push a mind-blowing 200 petaflops. Summit will come online next year, offering double the performance of the current #1 fastest supercomputer in the world.

Continue reading: US failing supercomputer ranks, falls to 1996 levels (full post)
