Thursday, July 11 • 11:55am - 12:15pm
NeuGraph: Parallel Deep Neural Network Computation on Large Graphs

Recent deep learning models have moved beyond low-dimensional regular grids such as images, video, and speech, to high-dimensional graph-structured data, such as social networks, e-commerce user-item graphs, and knowledge graphs. This evolution has led to large graph-based neural network models that go beyond what existing deep learning frameworks or graph computing systems are designed for. We present NeuGraph, a new framework that bridges the graph and dataflow models to support efficient and scalable parallel neural network computation on graphs. NeuGraph introduces graph computation optimizations into the management of data partitioning, scheduling, and parallelism in dataflow-based deep learning frameworks. Our evaluation shows that, on small graphs that can fit in a single GPU, NeuGraph outperforms state-of-the-art implementations by a significant margin, while scaling to large real-world graphs that none of the existing frameworks can handle directly with GPUs.
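
To make the setting concrete, below is a minimal sketch of one message-passing layer of a graph neural network, the kind of per-vertex computation that systems in this space execute over large graphs. The use of PyTorch, the class name, and the gather/apply structure shown here are illustrative assumptions for a generic GNN layer, not NeuGraph's actual programming model or API.

# Illustrative sketch of a single graph neural network layer.
# PyTorch and all names here are assumptions for illustration, not NeuGraph's API.
import torch
import torch.nn as nn


class SimpleGraphLayer(nn.Module):
    """One message-passing layer: collect neighbor features, sum them, apply an update."""

    def __init__(self, in_dim: int, out_dim: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x: torch.Tensor, edge_index: torch.Tensor) -> torch.Tensor:
        # x:          [num_vertices, in_dim] vertex feature matrix
        # edge_index: [2, num_edges] rows of (source, destination) vertex ids
        src, dst = edge_index
        # Each edge carries its source vertex's feature as a "message".
        messages = x[src]
        # Sum incoming messages at each destination vertex.
        aggregated = torch.zeros_like(x)
        aggregated.index_add_(0, dst, messages)
        # Per-vertex neural network update.
        return torch.relu(self.linear(aggregated))


# Tiny usage example: a 3-vertex graph with edges 0->1, 1->2, 2->0.
if __name__ == "__main__":
    x = torch.randn(3, 8)
    edge_index = torch.tensor([[0, 1, 2], [1, 2, 0]])
    layer = SimpleGraphLayer(8, 16)
    print(layer(x, edge_index).shape)  # torch.Size([3, 16])

On a graph with millions of vertices, the feature matrix and edge list above no longer fit in a single GPU's memory, which is the data-partitioning and scheduling problem the abstract refers to.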

Speakers
Lingxiao Ma, Peking University
Zhi Yang, Peking University
Youshan Miao, Microsoft Research
Jilong Xue, Microsoft Research
Ming Wu, Microsoft Research
Lidong Zhou, Microsoft Research
Yafei Dai, Peking University


Thursday July 11, 2019 11:55am - 12:15pm PDT
USENIX ATC Track II: Grand Ballroom VII–IX