Deep Learning Inference Server
- 2x Intel Xeon Cascade Lake
- 12x 3.5” Drive Bays
- Max. 768GB Memory
NEW
2U High Performance x86 Hyper-Converged Appliance with 12x 3.5” Storage Bays
- 2U High Performance Hyper-converged Appliance
- Supports dual 2nd Gen Intel® Xeon® Scalable processors up to 205W and up to 24x DDR4 R-DIMMs
- Front: 12x 3.5” SATA 6G HDD (Default); SAS / 2x NVMe (Optional)
- Rear: 2x 2.5” SATA 6G
- Console, LOM port, MGT port, 2x USB 3.0 ports, RJ45 and SFP+ ports
- 2x PCIe x16 FH 10.5”L (Max. 266.7mm) + 1x PCIe x8 HH/HL
- 1+1 redundant power supply
- An NVIDIA-Certified System for the enterprise edge
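The 768GB memory maximum above is consistent with populating all 24 DIMM slots; a quick arithmetic sketch, assuming 32GB R-DIMM modules (the module size is inferred from the listed totals, not stated in the spec):

```python
# Sketch: relate the 24 DIMM slots to the "Max. 768GB Memory" line.
# The 32 GB module size is an assumption inferred from the two figures.
DIMM_SLOTS = 24
MODULE_GB = 32  # assumed R-DIMM capacity

max_memory_gb = DIMM_SLOTS * MODULE_GB
print(max_memory_gb)  # 768
```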
Deep Learning Inference Server
- Intel® Xeon D-2100
- Short Depth Chassis
- Wide Temperature
NEW
Short Depth Chassis Edge Computing Appliance with Intel® Xeon® D-2100 Multi-core Processor (Codenamed Skylake-DE)
- Intel® Xeon® D-2100 12/16-Core Processor
- Wide Operating Temperature: -40°C to 65°C
- 2x DDR4 2667MHz ECC Registered RDIMM, Max. 64GB
- Front-Access I/O: 1x GbE RJ45 for IPMI, 8x 10G SFP+, 2x 40G QSFP+, 1x RJ45 Console, 1x USB 3.0 and Front Fan Replacement
- 2x 2.5” Internal HDD/SSD Bays, 2x M.2 NVMe 2280 M key
- 1x PCIe x16 FH/HL slot for FPGA or GPU cards
- Optional ITU-T G.8272 T-GM-Compliant IEEE 1588v2, SyncE, Onboard GPS
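The IEEE 1588v2 timing option above would typically be driven by a PTP stack on the host. A minimal linuxptp `ptp4l` configuration sketch, assuming hardware timestamping on an SFP+ port named `eth0` (the interface name and values are illustrative, not vendor-supplied):

```ini
# /etc/ptp4l.conf — illustrative fragment only
[global]
domainNumber        24        # telecom-profile PTP domain (illustrative)
priority1           128
time_stamping       hardware  # use the NIC's hardware timestamps

[eth0]
network_transport   L2        # telecom profiles commonly carry PTP over Ethernet
```

Synchronous Ethernet (SyncE) recovery and the GPS reference are handled outside this file, in the NIC/PHY and the grandmaster stack respectively.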
Deep Learning Inference Server
- Intel® Xeon D-2100
- Short Depth Chassis
- Wide Temperature
NEW
1U 19” Rackmount Open RAN Appliance with Intel® Xeon® D-2100 Multi-core Processor (Codenamed Skylake-DE)
- Intel® Xeon® D-2100 8/12/14/16-Core Processor
- Short Depth Chassis and Wide Operating Temperature: -40°C to 65°C
- 2x DDR4 2667MHz ECC Registered RDIMM, Max. 64GB
- Front-Access I/O with 1x GbE RJ45 for IPMI, 8x 10G SFP+, 1x RJ45 Console, 1x USB 3.0 and Screw-less Fan Replacement
- 4x 2.5” Internal HDD/SSD Bays, 1x M.2 NVMe 2280 M key
- 1x PCIe x16 slot for FPGA or GPU cards (Max. 75W)
- Intel® QuickAssist Technology
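Intel® QuickAssist Technology offloads bulk cryptography and compression from the CPU cores. As a rough, CPU-only illustration of the class of workload QAT targets, here is a stdlib compression round-trip in Python (QAT itself is driven through vendor libraries such as qatlib, which are not shown here):

```python
import zlib

# CPU-bound DEFLATE compression — the kind of work QAT can offload
# so the Xeon D cores stay free for packet processing.
payload = b"example packet payload " * 1000

compressed = zlib.compress(payload, level=6)
restored = zlib.decompress(compressed)

assert restored == payload
print(f"{len(payload)} -> {len(compressed)} bytes")
```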