Dell HPC Community event at SC ’16

The Saturday before Supercomputing ’16, I had the pleasure of participating in the Dell HPC Community event. As you might expect from Dell’s partnership with us, the cloud was a big focus this year. What I didn’t expect, however, was just how much it had everyone’s attention. Jay Boisseau, Chief HPC Technology Strategist at Dell, kicked off the event by saying that interest in cloud technologies has dominated customer input over the last six months. And Jim Ganthier, Sr. VP of Validated Solutions and HPC, was unequivocal about how Dell intends to respond: “We are going to make Cloud HPC a core Dell offering and competency.”

The event unfolded unlike any prior Dell gathering I’d attended. Instead of customers (or Dell) talking about PowerEdge servers, the latest CPUs and GPUs, or anything related to Linpack, the presentations all spoke to organizations trying new approaches to serving scientists and engineers with the help of cloud technologies. A common motivation emerged as well: urgency to meet the demands of increasingly heterogeneous and data-driven workloads in more nimble and collaborative ways.

Two of the more interesting cloud-related presentations came from Shawn Strande (San Diego Supercomputer Center) and Tommy Minyard (Texas Advanced Computing Center). They talked about their efforts to run various cloud technologies on locally hosted cyberinfrastructure programs funded by the National Science Foundation. Each indicated that cloudifying (my fake-word, not theirs) was putting meaningfully better control in the hands of researchers from their state and the nationwide XSEDE network, especially with the rise of NSF programs that see enormous volumes of data streaming in from geographically distributed sensor and instrument networks. But...