Equipped with CPUs, AI accelerators, network I/O, and storage, Lanner edge AI servers perform AI inference and real-time analytics at the edge, serving applications across network security, automation, computer vision, and autonomous driving. By processing data locally, they empower these industries with real-time intelligence, faster decision-making, and reduced bandwidth usage.
Deep Learning Inference Server
Intel® Xeon® D-2100
Short Depth Chassis
Wide Temperature
1U 19” Rackmount Open RAN Appliance with Intel® Xeon® D-2100 Multi-core Processor (Codenamed Skylake-DE)
Intel® Xeon® D-2100 8/12/14/16 Cores Processor
Short Depth Chassis and Wide Operating Temperature -40°C to 65°C
2x DDR4 2667 MHz REG, ECC RDIMM, Max. 64GB
Front Access I/O with 1x GbE RJ45 for IPMI, 8x 10G SFP+, 1x RJ45 Console, 1x USB 3.0 and Screw-less Fan Replacement
4x 2.5” Internal HDD/SSD Bays, 1x M.2 NVMe 2280 M key
1x PCIe x16 Slot for FPGA or GPU Cards (Max. 75W)
Intel® QuickAssist Technology
Deep Learning Inference Server
Intel® Xeon® D-2100
Short Depth Chassis
Wide Temperature
Short Depth Chassis Edge Computing Appliance with Intel® Xeon® D-2100 Multi-core Processor (Codenamed Skylake-DE)
Intel® Xeon® D-2100 12/16 Cores Processor
Wide Operating Temperature -40°C to 65°C
2x DDR4 2667 MHz REG, ECC RDIMM, Max. 64GB
Front Access I/O: 1x GbE RJ45 IPMI, 8x 10G SFP+, 2x 40G QSFP+, 1x RJ45 Console, 1x USB 3.0 and Front Fan Replacement
2x 2.5” Internal HDD/SSD Bays, 2x M.2 NVMe 2280 M key
1x PCIe x16 FH/HL Slot for FPGA or GPU Cards
Optional G.8272 T-GM Compliant IEEE 1588v2 / SyncE, Onboard GPS
Deep Learning Inference Server
Intel® Xeon® D-2800/2700
DDR4 Max. 256GB
8x 10G SFP+, 2x 25G SFP28
NEW
High Performance Edge Computing Appliance With Intel® Xeon® D-2800 Series Multi-core Processor (Codenamed Ice Lake-D)
Intel® Xeon® D-2700/2800, 8–22 Cores
4x DDR4 3200/2933MHz REG RDIMM, Max. 256GB
8x 10G SFP+, 2x 25G SFP28, 2x GbE RJ45, 2x USB 3.0
1x RJ45 Console, 2x 2280 M.2 (1x NVMe & 1x SATA)
-40°C to 65°C Operating Temperature (SKU B/C/E)
1x OCP 3.0, 1x PCIe x16
Deep Learning Inference Server
Intel Xeon (Emerald Rapids)
OCP 3.0 NIC, TPM
3x PCIe Slots
NEW
5G Edge Server With The 5th Gen Intel® Xeon® Scalable Processors (Codenamed Emerald Rapids)
5th Gen Intel® Xeon® Scalable Processors
Intel vRAN Boost Support (For 4th Gen Intel® Xeon® Scalable Processors Only)
Short Depth Chassis and Front I/O Design
16x DDR5 4400MHz RDIMM, Max. 1024GB
1x OCP 3.0 NIC Module
0°C to 50°C Operating Temperature (By CPU SKU)
2x M.2 NVMe 2280, 2x 2.5” SATA/U.2
1x FHFL PCIe x16 Slot, 2x LP or 1x FHHL Slot (PCIe x8)
Secure BMC / TPM 2.0
An NVIDIA-Certified System for industrial edge
Deep Learning Inference Server
NVIDIA MGX Architecture
NVIDIA Grace Hopper™ CPU
Intel® Xeon® 6 Processor
Preliminary
2U 19” Modular Edge AI Server Platform Based On NVIDIA MGX Architecture
NVIDIA Grace Hopper™ CPU Or Intel® Xeon® 6 Processor
8x DDR5 6400MHz RDIMM, Max. 1536GB System Memory
1x GbE RJ45, 1x RJ45 Console, 1x USB 3.0
2x M.2 NVMe (PCIe), 2x Or 1x PCIe x16 FH 3/4L (By SKU), 1x PCIe x16 FHHL Or 1x PCIe x16 LP
6x Smart Fans, 1600W AC CRPS PSU
Deep Learning Inference Server
2x Intel Xeon Cascade Lake
12x 3.5” Drive Bays
Max. 768GB Memory
NEW
2U High Performance x86 Hyper-converged Appliance with 12x 3.5” Storage Bays
Dual 2nd Gen Intel® Xeon® Scalable Processors up to 205W, Max. 24x DDR4 R-DIMM
Front: 12x 3.5” HDD SATA 6G (Default), SAS / 2x NVMe (Optional)
Rear: 2x 2.5” SATA 6G
Console, LOM Port, MGT Port, 2x USB 3.0 Ports, RJ45, SFP+ Ports
2x PCIe x16 FH 10.5”L (Max. 266.7mm) + 1x PCIe x8 HH/HL
1+1 Redundant Power Supply
An NVIDIA-Certified System for enterprise edge
Deep Learning Inference Server
2x Intel® Xeon® 6 Processors
288 E-cores
1536GB DDR5
Preliminary
2U 19” Rackmount Network Security Appliance Built With Dual Intel® Xeon® 6 Processors
Dual Intel® Xeon® 6 Processors With Up To 288 E-cores
16x RAM Max. 1536GB DDR5 6400 MT/s REG DIMM Or MCR DIMM 8000 MT/s
8x Or 4x NIC Slots, 2x GbE RJ45, 1x RJ45 Console, 1x LOM, 2x USB 3.0
2x U.2 Hot-swappable NVMe Or 2.5” SATA, 2x M.2 NVMe
2x FHHL Or 1x FH3/4L GPU Card Support
4x Individual Hot-swappable Fans, 1+1 Redundant PSUs
Deep Learning Inference Server
2x Intel Xeon (Ice Lake SP)
Max. 1536GB Memory
8x NIC, 2x RJ45, 1x Console
2U 19" Rackmount Network Appliance Built with 3rd Gen Intel® Xeon® Scalable Processor (Codenamed Ice Lake SP)
Dual 3rd Gen Intel® Xeon® Scalable Processor (Ice Lake SP)
24x DDR4 2133/2400/2666/2933/3200 MHz, Max. 1536GB
8x NIC Slots, 2x GbE RJ45, 1x RJ45 Console, 1x LOM, 2x USB 3.0
PCIe x16 Gen 4 Expansion (Optional), 3x M.2-2280 (NVMe & SATA)
100G Intel® QAT, Intel SGX, Intel Boot Guard, TPM 2.0
4x Individual Hot-swappable Fans, 1300W/2000W 1+1 ATX Redundant PSUs
Deep Learning Inference Server
2x Intel Xeon (Emerald Rapids)
Max. 8x NIC
Max. 12x NVMe HDD
NEW
2U 19” Rackmount Network Appliance Built With The 5th Gen Intel® Xeon® Scalable Processors
Dual 5th Gen Intel® Xeon® Scalable Processors (Emerald Rapids)
24x 288-pin DDR5 4800MHz R-DIMM, Max. 1536GB
8x NIC Slots, 2x GbE RJ45, 1x RJ45 Console, 1x LOM, 2x USB 3.0
2x 2.5” HDD/SSD (SKU A & C)
6x Individual Hot-swappable Fans, 1600W/2000W 1+1 ATX Redundant PSUs
Intel® QuickAssist Technology