
Mellanox Connect-IB™ Dual-Port FDR 56Gb/s InfiniBand Host Channel Adapter Card - Part ID: MCB194A-FCAT

Connect-IB Host Channel Adapter, dual-port QSFP, FDR 56Gb/s, PCIe 3.0 x16, tall bracket, RoHS R6
SKU: MCB194A-FCAT
Weight: 20 lbs
Shipping: Ships within 3-5 days
Price: $697.80 (regular price $1,394.80, save 50%)
Availability: Call for availability

Call us for special pricing: 408-895-5000 x1
FREE SHIPPING AVAILABLE
End of Life: Please contact a sales representative for an alternate/replacement part: shop@ironnetworks.com or 408-943-8000

Overview

Mellanox Connect-IB™ Dual-Port InfiniBand Host Channel Adapter Card

Connect-IB adapter cards provide the highest performing and most scalable interconnect solution for server and storage systems. High-Performance Computing, Web 2.0, Cloud, Big Data, Financial Services, Virtualized Data Centers and Storage applications will achieve significant performance improvements resulting in reduced completion time and lower cost per operation.

World Class Performance

Connect-IB delivers leading performance with maximum bandwidth, low latency, and computing efficiency for performance-driven server and storage applications. Maximum bandwidth is delivered across PCI Express 3.0 x16 and two ports of FDR InfiniBand, supplying more than 100Gb/s of throughput together with consistent low latency across all CPU cores. Connect-IB also enables PCI Express 2.0 x16 systems to take full advantage of FDR, delivering at least twice the bandwidth of existing PCIe 2.0 solutions.
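
As a rough, back-of-the-envelope check on the "more than 100Gb/s" figure (approximate values that account only for line-encoding overhead; these calculations are not part of the original product brief):

$2 \times (4 \times 14.0625\,\mathrm{Gb/s} \times 64/66) \approx 109\,\mathrm{Gb/s}$  (two FDR 4x ports)
$16 \times 8\,\mathrm{GT/s} \times 128/130 \approx 126\,\mathrm{Gb/s}$  (PCIe 3.0 x16, before transaction-layer overhead)

Actual application throughput depends on message size, protocol overhead, and system configuration.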

Connect-IB offloads protocol processing and data movement from the CPU to the interconnect, maximizing CPU efficiency and accelerating parallel and data-intensive application performance. Connect-IB supports new data operations, including noncontiguous memory transfers, which eliminate unnecessary data copy operations and CPU overhead. Additional application acceleration is achieved with a 4X improvement in message rate compared with previous generations of InfiniBand cards.
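
To illustrate what a noncontiguous memory transfer looks like at the application level, here is a minimal MPI sketch (a hypothetical example, not taken from Mellanox documentation) that sends a strided buffer using an MPI derived datatype instead of packing it into a staging copy; whether the gather/scatter is actually offloaded to the adapter depends on the MPI library and driver stack, not on this code.

/* Hypothetical sketch: sending a strided (noncontiguous) buffer with an MPI
 * derived datatype, so the MPI/interconnect stack can transfer it without an
 * intermediate copy. Build with an MPI wrapper (e.g. mpicc) and run with at
 * least two ranks, e.g. mpirun -np 2 ./strided. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);

    double buf[16];
    for (int i = 0; i < 16; i++)
        buf[i] = rank * 100.0 + i;

    /* Describe every other double in buf: 8 blocks of 1 element, stride 2. */
    MPI_Datatype strided;
    MPI_Type_vector(8, 1, 2, MPI_DOUBLE, &strided);
    MPI_Type_commit(&strided);

    if (rank == 0) {
        MPI_Send(buf, 1, strided, 1, 0, MPI_COMM_WORLD);
    } else if (rank == 1) {
        double recv[8];   /* the 8 strided elements arrive contiguously */
        MPI_Recv(recv, 8, MPI_DOUBLE, 0, 0, MPI_COMM_WORLD, MPI_STATUS_IGNORE);
        printf("rank 1 received %f ... %f\n", recv[0], recv[7]);
    }

    MPI_Type_free(&strided);
    MPI_Finalize();
    return 0;
}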

Unlimited Scalability

The next level of scalability and performance requires a new generation of data and application accelerations. Mellanox Messaging (MXM) and Fabric Collective Accelerator (FCA), utilizing CORE-Direct™ technology, accelerate MPI and PGAS communication performance, taking full advantage of Connect-IB's enhanced capabilities. Furthermore, Connect-IB introduces an innovative transport service, Dynamically Connected Transport, to ensure unlimited scalability for clusters of servers and storage systems.
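
For context, FCA and CORE-Direct target MPI collective operations such as the all-reduce in the minimal sketch below (a hypothetical example; enabling the actual collective offload is a matter of MPI/FCA configuration in the software stack and is not shown here).

/* Minimal MPI collective sketch (assumes an MPI installation, e.g. one shipped
 * with an OFED/MLNX_OFED stack). The MPI_Allreduce below is the kind of
 * collective that FCA/CORE-Direct can accelerate when enabled. */
#include <mpi.h>
#include <stdio.h>

int main(int argc, char **argv)
{
    MPI_Init(&argc, &argv);

    int rank, size;
    MPI_Comm_rank(MPI_COMM_WORLD, &rank);
    MPI_Comm_size(MPI_COMM_WORLD, &size);

    /* Each rank contributes its rank number; the sum is computed on all ranks. */
    long local = rank, global = 0;
    MPI_Allreduce(&local, &global, 1, MPI_LONG, MPI_SUM, MPI_COMM_WORLD);

    if (rank == 0)
        printf("sum of ranks 0..%d = %ld\n", size - 1, global);

    MPI_Finalize();
    return 0;
}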

High Performance Storage

Storage nodes will see improved performance with the higher bandwidth FDR delivers, and standard block and file access protocols can leverage InfiniBand RDMA for even better performance. Connect-IB also supports hardware checking of T10 Data Integrity Field / Protection Information (DIF/PI) and other signature types, reducing the CPU overhead and accelerating the data to the application. Signature translation and handover are also done by the adapter, further reducing the load on the CPU. Consolidating compute and storage over FDR InfiniBand with Connect-IB achieves superior performance while reducing data center costs and complexities.

Software Support

Mellanox adapter cards are supported on Microsoft Windows and major Linux distributions. Connect-IB adapters support OpenFabrics-based RDMA protocols and software, and are compatible with configuration and management tools from OEMs and operating system vendors.
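
As a concrete illustration of the OpenFabrics-based software stack mentioned above, the sketch below uses the standard libibverbs API shipped with OFED to list the RDMA devices on a host and print each port's state. It is an illustration only; device names (e.g. mlx5_0) and the output depend on the installed driver and hardware.

/* Sketch: enumerate RDMA devices and report port state via libibverbs (OFED).
 * Build with: cc verbs_list.c -o verbs_list -libverbs */
#include <infiniband/verbs.h>
#include <stdint.h>
#include <stdio.h>

int main(void)
{
    int num = 0;
    struct ibv_device **devs = ibv_get_device_list(&num);
    if (!devs) {
        perror("ibv_get_device_list");
        return 1;
    }

    for (int i = 0; i < num; i++) {
        struct ibv_context *ctx = ibv_open_device(devs[i]);
        if (!ctx)
            continue;

        struct ibv_device_attr dev_attr;
        if (ibv_query_device(ctx, &dev_attr) == 0) {
            /* Ports are numbered starting at 1. */
            for (uint8_t port = 1; port <= dev_attr.phys_port_cnt; port++) {
                struct ibv_port_attr port_attr;
                if (ibv_query_port(ctx, port, &port_attr) == 0)
                    printf("%s port %u: state %s, LID %u\n",
                           ibv_get_device_name(devs[i]), (unsigned)port,
                           ibv_port_state_str(port_attr.state),
                           (unsigned)port_attr.lid);
            }
        }
        ibv_close_device(ctx);
    }

    ibv_free_device_list(devs);
    return 0;
}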

BENEFITS

  • World-class cluster, network, and storage performance
  • Guaranteed bandwidth and low-latency services
  • I/O consolidation
  • Virtualization acceleration
  • Power efficient
  • Scales to tens-of-thousands of nodes

KEY FEATURES

  • Greater than 100Gb/s over InfiniBand
  • Greater than 130M messages/sec
  • 1us MPI ping latency
  • PCI Express 3.0 x16
  • CPU offload of transport operations
  • Application offload
  • GPU communication acceleration
  • End-to-end internal data protection
  • End-to-end QoS and congestion control
  • Hardware-based I/O virtualization
  • RoHS-R6

Technical Specification

Feature Summary*

INFINIBAND
  • IBTA Specification 1.2.1 compliant
  • FDR 56Gb/s InfiniBand
  • Hardware-based congestion control
  • 16 million I/O channels
  • 256 to 4Kbyte MTU, 1Gbyte messages
 
ENHANCED INFINIBAND
  • Hardware-based reliable transport
  • Extended Reliable Connected transport
  • Dynamically Connected transport service
  • Signature-protected control objects
  • Collective operations offloads
  • GPU communication acceleration
  • Enhanced Atomic operations
 
STORAGE SUPPORT
  • T10-compliant DIF/PI support
  • Hardware-based data signature handovers
 
FLEXBOOT™ TECHNOLOGY
  • Remote boot over InfiniBand 
 
HARDWARE-BASED I/O VIRTUALIZATION
  • Single Root IOV**
  • Up to 16 physical functions, 256 virtual functions
  • Address translation and protection
  • Dedicated adapter resources
  • Multiple queues per virtual machine
  • Enhanced QoS for vNICs and vHCAs
  • VMware NetQueue support
 
PROTOCOL SUPPORT
  • Open MPI, IBM PE, Intel MPI, OSU MPI (MVAPICH/2), Platform MPI, UPC, Mellanox SHMEM
  • TCP/UDP, IPoIB, RDS
  • SRP, iSER, NFS RDMA, SMB Direct
  • uDAPL

COMPATIBILITY

PCI EXPRESS INTERFACE
  • PCI Express 2.0 or 3.0 compliant
  • Auto-negotiates to x16, x8, x4, x2, or x1
  • Support for MSI-X mechanisms
 
CONNECTIVITY
  • Interoperable with InfiniBand switches
  • Passive copper cable with ESD protection
  • Powered connectors for optical and active cable support
 
OPERATING SYSTEMS/DISTRIBUTIONS
  • Novell SLES, Red Hat Enterprise Linux (RHEL), and other Linux distributions
  • Microsoft Windows Server 2008/CCS 2003, HPC Server 2008
  • VMware ESX 5.1
  • OpenFabrics Enterprise Distribution (OFED)
  • OpenFabrics Windows Distribution (WinOF)

Manufacturer Warranty

1 Year

* This brief describes hardware features and capabilities. Please refer to the driver release notes on www.mellanox.com for feature availability.

** Future support

