Gary Grider and Brad Settlemyer

Handling Trillions of Supercomputer Files Just Got Simpler

April 2, 2019
DeltaFS, an exascale file system, breaks the “metadata bottleneck” by handling extreme numbers of files and amounts of data with unprecedented performance.

DeltaFS, a new distributed file system for high-performance computing developed by computer scientists at Los Alamos National Laboratory and Carnegie Mellon University, will be distributed via GitHub, a software collaboration site. DeltaFS is expected to vastly improve the tasks of creating, updating, and managing extreme numbers of files.

“We designed it to enable the creation of trillions of files,” says Brad Settlemyer, a Los Alamos computer scientist and project leader. “Such a tool helps researchers solve classical problems in high-performance computing, such as particle-trajectory tracking or vortex detection.”
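To see why file counts explode, consider the output pattern behind particle-trajectory tracking: each simulation process writes one small file per particle, so a trillion-particle run implies on the order of a trillion files. The sketch below is a toy illustration of that pattern in plain Python with ordinary POSIX file calls, not DeltaFS code; the directory layout and file names are hypothetical.

```python
# Toy illustration (not DeltaFS code): the per-particle output pattern
# that produces extreme file counts. Each simulation rank writes one
# small file per particle, so trillion-particle runs imply trillions
# of files -- the workload DeltaFS is designed to absorb.
import os

def write_particle_trajectories(out_dir, particles, step):
    """particles: iterable of (particle_id, (x, y, z)) tuples."""
    os.makedirs(out_dir, exist_ok=True)
    for pid, pos in particles:
        # One file per particle: appending keeps each particle's whole
        # trajectory in its own file, so a later query can read one
        # small file instead of scanning a bulk checkpoint.
        path = os.path.join(out_dir, f"particle_{pid}.traj")
        with open(path, "a") as f:
            f.write(f"{step} {pos[0]} {pos[1]} {pos[2]}\n")

# Example: one rank writes three particles at timestep 7.
write_particle_trajectories(
    "traj_out",
    [(1, (0.0, 0.1, 0.2)), (2, (1.0, 1.1, 1.2)), (3, (2.0, 2.1, 2.2))],
    step=7,
)
```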

DeltaFS presents the user with a file system that looks just like any other, requires no specialized hardware, and is tailored to help scientists and engineers make new discoveries on high-performance computers.

“One of the foremost challenges, and primary goals of DeltaFS, was getting it to work across thousands of servers without requiring a portion of each be dedicated to the file system,” says George Amvrosiadis, assistant research professor at Carnegie Mellon University. “This frees administrators from having to decide how to allocate resources for the file system, which will become a necessity when exascale machines become a reality.”

The file system brings about two important changes in high-performance computing. First, DeltaFS makes it possible to deploy new strategies for designing supercomputers, dramatically lowering the cost of creating and managing files. Second, DeltaFS radically improves performance on highly selective queries, reducing the time needed for scientific discovery.
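A minimal sketch of the selective-query benefit, continuing the toy per-particle layout from the earlier example (again plain Python file I/O, not DeltaFS code): retrieving one particle's full trajectory touches a single small file rather than scanning every rank's bulk checkpoint.

```python
# Minimal sketch (assumes the per-particle files written above):
# a highly selective query reads exactly one small file.
import os

def read_trajectory(out_dir, pid):
    path = os.path.join(out_dir, f"particle_{pid}.traj")
    with open(path) as f:
        # Each line is "step x y z"; parse into (step, (x, y, z)).
        return [(int(s), (float(x), float(y), float(z)))
                for s, x, y, z in (line.split() for line in f)]

print(read_trajectory("traj_out", 1))
```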

DeltaFS is a transient, software-defined service that lets data be accessed from anywhere from a handful to hundreds of thousands of computers, based on the user’s performance requirements.
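The sketch below illustrates the transient, software-defined idea in miniature, using an entirely hypothetical API (DeltaFS's real interface lives in its GitHub repository): file metadata is held in the clients' own memory for the life of a job, and each client publishes its share to shared storage when the job ends, so no dedicated file-system servers are ever provisioned.

```python
# Conceptual sketch with a hypothetical API -- not DeltaFS's real
# interface. A per-job namespace exists only while the job runs and
# scales with however many client processes the user assigns to it.
import json
import os

class TransientNamespace:
    def __init__(self, job_dir):
        self.job_dir = job_dir
        self.entries = {}  # metadata held client-side, in memory

    def create(self, name, size):
        # Creating a file only touches this client's local table;
        # no round trip to a dedicated metadata server.
        self.entries[name] = {"size": size}

    def publish(self, rank):
        # At job end, each client dumps its share of the metadata to
        # shared storage; a later job can mount the union of the logs.
        os.makedirs(self.job_dir, exist_ok=True)
        with open(os.path.join(self.job_dir, f"manifest_{rank}.json"), "w") as f:
            json.dump(self.entries, f)

ns = TransientNamespace("job_42_ns")
ns.create("particle_1.traj", size=128)
ns.publish(rank=0)
```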

“The storage techniques used in DeltaFS apply in many scientific domains, but by alleviating the metadata bottleneck, we have shown a way to design and procure much more efficient HPC storage,” Settlemyer says.
