
What Is High Performance Data and Computing (HPDC)?

By Eddie Huffman | October 25, 2022

High-performance computing (HPC) combines the resources of multiple computers to supercharge problem-solving power and slash required computing time.


In the 1951 movie No Highway in the Sky, Jimmy Stewart plays a scientist who sabotages an unsafe plane rather than let it take off and put passengers at risk. He knows about the plane’s fatal flaw because he had tested it for hundreds of hours in his laboratory.

Were that movie to be remade today, Stewart's role would likely be played by a computer. That's because rapid software simulations have replaced the kind of physical testing the Stewart character conducts. In developing its Overture airliner, a supersonic transport that promises to take passengers from New York to Paris in only 3.5 hours, Boom Supersonic used supercomputers to run 53 million compute hours on Amazon Web Services as a substitute for physical prototyping and wind-tunnel testing. The company plans to log more than 100 million additional compute hours before rolling out its finished product.

This type of high performance data and computing (HPDC) not only provides more detailed and more accurate information than physical testing but does so more efficiently. The Concorde, the original supersonic passenger jet, required 25 years of research and development and countless hours of real-world testing before making its first commercial flight in 1976. Overture should be airborne after a development period roughly half as long.

The same principles used to develop Overture apply to spaceflight. Today, NASA looks to supercomputer frameworks to help it resolve its next challenges. “Our current spaceflight computers were developed almost 30 years ago,” says Wesley Powell, NASA’s principal technologist for advanced avionics. “Future NASA missions demand significantly increased onboard computing capabilities and reliability. The new computing processor will provide the advances required in performance, fault tolerance, and flexibility to meet these future mission needs.”

High-speed computing systems provide solutions to a broad range of challenges in the air, beneath the sea, and on the ground. What is high performance data and computing (HPDC)? This article addresses that question. It also explores:

  • What are the benefits and drawbacks of high performance data and computing?
  • Jobs related to high performance data and computing

What is high performance data and computing (HPDC)?

Just how super is a supercomputer? The term is misleading. High performance data and computing (HPDC or HPC) involves multiple computers—each called a node—pooling their resources, not a single mega-machine. These nodes are linked through a communications network and work together to solve large, complex problems in such fields as healthcare, engineering, science, and business. The computers may be in close proximity on premises or widely dispersed and accessed via the cloud.
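
To make the node-pooling idea concrete, here is a minimal sketch in Python using mpi4py, a common HPC library (the framework choice and the toy summation task are our assumptions; the article names no specific tooling). Each process stands in for a node, handles its own slice of the work, and the results are combined over the network:

```python
# Minimal sketch of nodes pooling their resources via MPI.
# Assumes mpi4py is installed and the script is launched under an
# MPI runtime, e.g.: mpirun -n 4 python sum_nodes.py
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()   # this node's ID (0, 1, 2, ...)
size = comm.Get_size()   # how many nodes are cooperating

N = 100_000_000
# Each node sums only every size-th number, starting at its rank...
partial = sum(range(rank, N, size))
# ...and the communications network combines the partial results.
total = comm.reduce(partial, op=MPI.SUM, root=0)

if rank == 0:
    print(f"{size} nodes computed total = {total:,}")
```

The same pattern, scaled up from four laptop processes to thousands of cluster nodes, is how HPC systems divide and conquer far harder problems.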

The numbers and terms associated with HPDC are so astronomical they sound made up. The Yale Center for Research Computing, which opened in 2015, includes a cluster of computers called Ruddle used in genetic research. “Ruddle has roughly the same computational power as you would get by stringing together 2,000 or so high-end laptop computers,” according to Yale Medicine Magazine. “In addition, it possesses three petabytes (three quadrillion bytes) of digital storage. That’s a lot. If you were to fill Ruddle’s storage devices with MP3-encoded songs, you could play music for 6,000 years.” Talk about big data.
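
Those storage figures hold up to a back-of-envelope check. Here is a quick Python sanity test, assuming a typical 128 kbps MP3 encoding (the bitrate is our assumption; the magazine doesn't specify one):

```python
# Sanity-check Yale's claim: 3 petabytes of MP3s ~ 6,000 years of music.
BYTES = 3 * 10**15               # three petabytes (three quadrillion bytes)
BYTES_PER_SECOND = 128_000 / 8   # 128 kbps MP3 -> 16,000 bytes per second

seconds = BYTES / BYTES_PER_SECOND
years = seconds / (60 * 60 * 24 * 365)
print(f"{years:,.0f} years of playback")   # prints: 5,946 years of playback
```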

Meanwhile, over in Switzerland, researchers used HPC power to set a new record for calculating the digits of pi (π). They smashed the old record, adding 12.8 trillion new digits and bringing the total to 62.8 trillion.
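
The record run required specialized software and months of compute time on a dedicated machine, but the same kind of arbitrary-precision arithmetic scales down to a laptop. A small sketch using Python's mpmath library (our choice of tool; it is not what the Swiss team used):

```python
# Compute pi to many decimal places with arbitrary-precision arithmetic.
# 10,000 digits takes a fraction of a second on a laptop; the Swiss
# record of 62.8 trillion digits is roughly ten orders of magnitude more.
from mpmath import mp

mp.dps = 10_000           # decimal places of working precision
pi = +mp.pi               # evaluate the constant at the current precision
print(str(pi)[:52])       # 3.14159265358979... (first 50 digits)
```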

How is high performance data and computing used?

HPDC has revolutionized DNA sequencing, weather forecasting, vehicle design, finance, and other fields. Supercomputers can handle massive amounts of data and run different scenarios simultaneously. The global HPDC market will reach $44 billion in 2022, according to Hyperion Research. Almost every industry represented in the Fortune 1000 uses HPC applications.


What are the benefits and drawbacks of high performance data and computing?

The power of high performance computing has spawned breakthroughs in many fields. The biggest drawback appears to be its cost, both in terms of money and the labor it takes to set up an HPDC system. Those issues have made cloud computing alternatives increasingly popular.

Speed and capacity

Modern computers are no slouches when it comes to processing data; the average desktop machine performs billions of calculations per second. HPC leaves the average PC in the dust, however, performing quadrillions of calculations per second. A problem that would take a single computer hours, days, weeks, months, or years to solve, a supercomputer can knock down to minutes, hours, days, or weeks.
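
The arithmetic behind that gap is easy to check. The figures below are illustrative assumptions (a desktop sustaining about 100 billion operations per second, a cluster sustaining a quadrillion), not benchmarks of any real machine:

```python
# How long does a fixed pile of work take at desktop vs. HPC speeds?
OPS_NEEDED = 1e18      # a hypothetical workload of a quintillion operations

DESKTOP_OPS = 1e11     # ~100 billion operations per second (assumed)
CLUSTER_OPS = 1e15     # ~1 quadrillion operations per second (assumed)

print(f"desktop: {OPS_NEEDED / DESKTOP_OPS / 86_400:,.0f} days")   # ~116 days
print(f"cluster: {OPS_NEEDED / CLUSTER_OPS / 60:,.1f} minutes")    # ~16.7 minutes
```

Months of desktop work collapse into minutes, which is exactly the tradeoff the article describes.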

A pricey approach

Oracle cites multiple challenges to locally hosted HPDC deployments:

  • Buying and upgrading computing equipment
  • Management and operational costs
  • Long waiting lists for users to run an HPDC workload
  • Delayed hardware upgrades because of long purchasing cycles

All of which explains the increasing move to cloud-based supercomputing. “Instead of having to architect, engineer, and build a supercomputer, companies can now rent hours on the cloud, making it possible to bring tremendous computational power to bear on R&D,” Joris Poort, founder and CEO of intelligent cloud supercomputing platform Rescale, wrote in the Harvard Business Review.

Reduced physical testing

Computer simulations are accelerating Boom Supersonic's path to getting its jets in the air. Poort cites other industries using technology to a similar end, including Commonwealth Fusion Systems, a nuclear fusion startup that “relies on simulations to validate potential reactor designs, as no commercial fusion reactor has ever existed.” Machine learning algorithms train software for advanced driver assistance systems.

Firefly Aerospace uses computational engineering to develop commercial rockets it plans to send to the moon. “Similarly, drug manufacturers need complex simulations to know how molecules will interact with a biological environment before they can commit to producing new drug discovery breakthroughs,” Poort writes.

Better weather warnings

Weather forecasting models use data analytics to predict everything from local thunderstorms to hurricanes that threaten multiple states. The National Oceanic and Atmospheric Administration plans to launch a new hurricane forecast model, the Hurricane Analysis and Forecast System, in time for the 2023 storm season. “Researchers are developing new ensemble-based forecast models at record speed, and now we have the computing power needed to implement many of these substantial advancements to improve weather and climate prediction,” says Ken Graham, director of the National Weather Service.
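
The “ensemble” approach Graham mentions is simple at heart: run the same model many times from slightly perturbed starting conditions, then study the spread of the outcomes. A toy illustration in Python (the model here is invented for demonstration; real forecast systems couple physics across millions of grid cells):

```python
import random

def toy_forecast(initial_temp):
    """Invented stand-in for a weather model: steady drift plus noise."""
    t = initial_temp
    for _ in range(24):                  # 24 forecast steps
        t += 0.1 + random.gauss(0, 0.5)  # drift + random perturbation
    return t

# Run 50 ensemble members, each from a slightly perturbed starting point.
members = [toy_forecast(20.0 + random.gauss(0, 0.2)) for _ in range(50)]
mean = sum(members) / len(members)
spread = max(members) - min(members)
print(f"ensemble mean: {mean:.1f} C, spread: {spread:.1f} C")
```

A wide spread signals an uncertain forecast; a tight one signals confidence. HPC is what makes running many members of a full-physics model fast enough to be useful.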

Solving healthcare mysteries

Remember the genetic researchers at Yale with three petabytes of digital storage? Those scientists and others like them have made great strides in gene sequencing, diagnosis of mysterious disorders, and new and improved treatment methods. “Having a specific genetic diagnosis can help doctors and families have a better understanding of treatments, outcomes, and predictions,” says Lauren Jeffries, clinical genetics coordinator for Yale Medicine’s Pediatric Genomics Discovery Program.

Jobs related to high performance data and computing

The spread of HPDC across a wide range of businesses, industries, and government agencies has opened up a similarly wide range of job opportunities, most of them at the high end of the IT pay scale.

What degree do you need to work on high performance data and computing?

A degree in computer science or engineering will give you a solid foundation for HPDC work. An advanced degree can give you a competitive edge in terms of job opportunities and higher salaries. For example, Case Western Reserve University and Southern Methodist University both offer master’s degrees in computer science. Some schools, including the University of Houston and Temple University, offer HPDC certificate programs.

Questions or feedback? Email editor@noodle.com

About the Author

Eddie Huffman is the author of John Prine: In Spite of Himself and a forthcoming biography of Doc Watson. He has written for Rolling Stone, the New York Times, Utne Reader, All Music Guide, Goldmine, the Virgin Islands Source, and many other publications.

About the Editor

Tom Meltzer spent over 20 years writing and teaching for The Princeton Review, where he was lead author of the company's popular guide to colleges, before joining Noodle.






Categorized as: Computer Science, Information Technology & Engineering