
Announcing the Mellanox ConnectX-5 100G InfiniBand Adapter


Page 1: Announcing the Mellanox ConnectX-5 100G InfiniBand Adapter

Mellanox ConnectX-5 Adapter, June 2016

Paving the Road to Exascale

Page 2

© 2016 Mellanox Technologies 2

The Ever Growing Demand for Higher Performance

[Chart: Performance Development, 2000-2020: the progression from Terascale to Petascale to Exascale ("Roadrunner" was the 1st Petascale system), and from single-core to many-core, SMP to clusters]

[Diagram: Co-Design: Hardware (HW), Software (SW), and Application (APP) designed together]

The Interconnect is the Enabling Technology

Page 3

The Intelligent Interconnect to Enable Exascale Performance

CPU-Centric:
- Must wait for the data, creating performance bottlenecks
- Limited to main CPU usage, resulting in performance limitations

Co-Design:
- Works on the data as it moves, enabling performance and scale
- Creates synergies, enabling higher performance and scale

Page 4

Breaking the Performance Bottlenecks

- Today: network device latencies are on the order of 100 nanoseconds
- Challenge: enabling the next order-of-magnitude improvement in application performance
- Solution: creating synergies between software and hardware, an intelligent interconnect

Intelligent Interconnect Paves the Road to Exascale Performance

- 10 years ago: Network ~10 microseconds, Communication Framework ~100 microseconds
- Today: Network ~0.1 microseconds, Communication Framework ~10 microseconds
- Future (Co-Design): Network ~0.05 microseconds, Communication Framework ~1 microsecond
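A quick sanity check on this trend: summing the rounded network and communication-framework figures for each era shows roughly an order-of-magnitude drop per step (simple arithmetic on the slide's numbers, nothing more):

```python
# Rounded end-to-end latency components from the slide, in microseconds.
eras = {
    "10 years ago": {"network": 10.0, "framework": 100.0},
    "today":        {"network": 0.1,  "framework": 10.0},
    "future":       {"network": 0.05, "framework": 1.0},
}

# Total time an application waits on network + communication framework.
totals = {era: sum(parts.values()) for era, parts in eras.items()}
print(totals)  # {'10 years ago': 110.0, 'today': 10.1, 'future': 1.05}
```

Each step cuts the total by roughly 10x, which is the "next order of magnitude" the previous slide calls for.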

Page 5

SC’15: Introducing Switch-IB 2, the World’s First Smart Switch

Page 6

ISC’16: Introducing ConnectX-5, the World’s Smartest Adapter

Page 7

ConnectX-5 EDR 100G Advantages

- 100Gb/s throughput
- 0.6usec latency (end-to-end)
- 200M messages per second (2X higher versus competition)

Highest Performance
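Simple arithmetic (using no data beyond the figures above) shows how tight these headline numbers are:

```python
# Back-of-envelope implications of the headline figures above.
link_rate_bps = 100e9   # 100 Gb/s EDR link
msg_rate = 200e6        # 200 million messages per second

ns_per_msg = 1e9 / msg_rate                     # per-message time budget
bytes_per_msg = (link_rate_bps / 8) / msg_rate  # payload per message at line rate

print(ns_per_msg)    # 5.0  -> only 5 ns of processing budget per message
print(bytes_per_msg) # 62.5 -> line rate is sustained even with small messages
```

A 5 ns per-message budget is well below what a software network stack can spend per message, which is why the adapter handles the per-packet work in hardware.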

Page 8

ConnectX-5 EDR 100G Advantages

- MPI Collectives
- MPI Tag Matching
- In-Network Computing
- In-Network Memory

Smart HPC Accelerations
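As one example of what moves into the adapter: MPI tag matching pairs each incoming message with a posted receive on (source, tag), where MPI_ANY_SOURCE / MPI_ANY_TAG act as wildcards. The sketch below (hypothetical helper names, not Mellanox code) illustrates the matching rule that ConnectX-5 can execute in hardware instead of the host MPI library:

```python
# Minimal sketch of MPI tag-matching semantics: an incoming message matches
# the FIRST posted receive whose (source, tag) fields agree, with
# ANY_SOURCE / ANY_TAG acting as wildcards. ConnectX-5 can perform this
# search on the adapter rather than in host software.
ANY_SOURCE = -1
ANY_TAG = -1

def match(posted, source, tag):
    """Return (and remove) the first matching posted receive, else None."""
    for i, (src, tg) in enumerate(posted):
        if src in (ANY_SOURCE, source) and tg in (ANY_TAG, tag):
            return posted.pop(i)
    return None  # unmatched: would be queued as an "unexpected" message

posted = [(0, 7), (ANY_SOURCE, 3), (2, ANY_TAG)]
print(match(posted, source=5, tag=3))   # (-1, 3): wildcard source matched
print(match(posted, source=2, tag=99))  # (2, -1): wildcard tag matched
print(match(posted, source=9, tag=9))   # None: goes to unexpected queue
```

Offloading this search matters because the posted-receive list must be walked for every arriving message, on the critical path of every MPI_Recv.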

Page 9

ConnectX-5 EDR 100G Advantages

- PCIe Gen3 and Gen4
- Integrated PCIe Switch
- Advanced Dynamic Routing
- Flexible Topology Design

Future Proofing

Page 10

ConnectX-5 EDR 100G Advantages

- Virtual Switch / Router (OVS)
- NVMe over Fabric offloads
- RDMA over InfiniBand / Ethernet
- Accelerated Switching and Packet Processing (ASAP2) Technology

Data Center Accelerations: Cloud, Storage, and more

Page 11

ConnectX-5 EDR 100G Architecture

[Diagram: X86, OpenPOWER, GPU, ARM, and FPGA hosts attach over PCIe Gen4, using Multi-Host Technology and an integrated PCIe switch; the adapter provides offload engines, RDMA transport, eSwitch & routing, and on-board memory, with InfiniBand / Ethernet ports at 10, 20, 40, 50, 56, and 100G]

Page 12

Enabling Most Efficient and Scalable Compute and Storage Platforms

- Cost-Effective, Flexible: Multi-Host, Host-Chaining, PCIe Gen4
- Cloud: Overlay Networks, RoCE (Lossy, Lossless, Overlay), Embedded Switch
- Storage: Background Check-Pointing, NVMe over Fabric, Rate Limiting
- Embedded Appliances (Security, GPU, Network, etc.): BlueField SoC, Integrated PCIe Switch

Page 13

Highest-Performance 100Gb/s Interconnect Solutions

- Transceivers: VCSELs, Silicon Photonics, and Copper
- Active Optical and Copper Cables (10 / 25 / 40 / 50 / 56 / 100Gb/s)
- Switch: 36 EDR (100Gb/s) ports, <90ns latency, 7.2Tb/s throughput, 7.02 billion msg/sec (195M msg/sec/port)
- Adapter: 100Gb/s, 0.6us latency, 200 million messages per second (10 / 25 / 40 / 50 / 56 / 100Gb/s)
- Ethernet Switch: 32 100GbE ports or 64 25/50GbE ports (10 / 25 / 40 / 50 / 100GbE), 6.4Tb/s throughput

Page 14

InfiniBand Delivers the Highest Application Performance

Performance Advantage of InfiniBand Increases with System Size

InfiniBand is the Only Interconnect that Scales with System Size

InfiniBand Enables Higher System Performance with 50% of the Servers

Multiple Application Cases Demonstrate 35-63% Performance Advantage

Page 15

Mellanox Smart Interconnect Solutions

InfiniBand / Ethernet Interconnect Solutions

Page 16

Thank You