Lab deployment Open Servers
Running a lab in VMware Workstation on a beefy laptop or desktop is okay, I guess, for some learning. I am looking at setting up a lab for some more full-featured testing.
Currently I am looking at a couple of Protectli Vaults (4-port or 6-port) or similar for the open-server gateways and a NUC or similar for the open-server management. I'm not planning on HTTPS inspection; rather, application control, threat prevention work, and dynamic routing.
Does anyone have suggestions or "gotchas" to look out for on this path?
Thanks in advance
Accepted Solutions
Most network cards will work. For 1G, go for Intel e1000 cards. For 10G, Intel ixgbe or Mellanox ConnectX-3 and up should be fine. For 25G, 40G, 50G, or 100G, go for Mellanox ConnectX-5.
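If it helps to double-check before committing to hardware, here's a rough way to see which kernel driver each NIC actually got bound to once a Linux-based OS is up. This is plain sysfs, nothing Check Point-specific, and it assumes a Python 3 interpreter is available, so treat it as a sketch:

```python
#!/usr/bin/env python3
# List each network interface and the kernel driver bound to it,
# by resolving the /sys/class/net/<iface>/device/driver symlink.
# Virtual interfaces (lo, bonds, VLANs) have no such symlink and are skipped.
import os

SYS_NET = "/sys/class/net"

for iface in sorted(os.listdir(SYS_NET)):
    driver_link = os.path.join(SYS_NET, iface, "device", "driver")
    if not os.path.islink(driver_link):
        continue  # not backed by a physical/PCI device
    driver = os.path.basename(os.readlink(driver_link))
    print(f"{iface}: {driver}")
```

Driver names like e1000, e1000e, igb, ixgbe, or mlx5_core line up with the recommendations above; Broadcom/Emulex drivers such as tg3, bnx2x, or be2net are the ones flagged as trouble later in the thread.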
Storage controllers will be a limitation. I would not expect the Protectli Vault's storage controllers to have drivers in Gaia. Check Point's 3100 and 3200 boxes use Atom C2000-series processors with integrated storage controllers. The 3600 and 3800 use Atom C3000-series processors. Something based on either of those lines should work well, and would allow a lot of ECC RAM. The Xeon D-1500 and D-2100 lines use the same storage controllers if you want a little more processor performance.
Be aware a lot of Atom C2000 models are subject to a hardware bug called AVR54 which will eventually render them unbootable. It's fixed in some models of C2000, and not present in any model of C3000 or Xeon. If you can, I'd go for a C3000 or Xeon D-1500. These chips are also good at virtualization (and they support *tons* of RAM) if you later decide you don't need dedicated hardware just for firewalls.
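On the storage-controller point, a quick way to see what mass-storage controllers a box has and whether the running kernel bound a driver to them is to walk sysfs. A minimal sketch (generic Linux, Python 3; it only really proves anything if you can run it under Gaia or its installer shell, since a stock live image ships far more drivers than Gaia does):

```python
#!/usr/bin/env python3
# Walk /sys/bus/pci/devices and report mass-storage controllers
# (PCI class 0x01xxxx) plus the kernel driver bound to each, if any.
# A controller with no driver bound is a likely install/boot problem.
import os

PCI_DEVICES = "/sys/bus/pci/devices"

for addr in sorted(os.listdir(PCI_DEVICES)):
    dev = os.path.join(PCI_DEVICES, addr)
    with open(os.path.join(dev, "class")) as f:
        pci_class = int(f.read().strip(), 16)
    if (pci_class >> 16) != 0x01:
        continue  # not a mass-storage controller
    driver_link = os.path.join(dev, "driver")
    if os.path.islink(driver_link):
        driver = os.path.basename(os.readlink(driver_link))
    else:
        driver = "NO DRIVER BOUND"
    print(f"{addr}  class=0x{pci_class:06x}  driver={driver}")
```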
Namely, if the hardware isn't listed on the HCL, there is a chance you will have driver issues, and we can't guarantee it will work.
https://www.checkpoint.com/support-services/hcl/
Running it over VMware may help you work around that scenario if it occurs.
For hardware NICs, stick with Intel and Mellanox/NVIDIA, as these cards will support Multi-Queue and give good performance. The only exception is Intel cards that use the e1000 driver, as these will not support Multi-Queue; if in VMware, use the vmxnet3 driver instead. I use vmxnet3 in my VMware Workstation training environment and it works very well.
Do not attempt to use hardware Broadcom or Emulex NICs, as they are absolutely terrible; if you are unfortunate enough to have these as the built-in onboard interfaces, disable them right in the server BIOS so they are completely out of the picture. Even when they are not configured and not even plugged in, I've seen these crappy onboards screw things up royally. Ironically, the older and more established the hardware NIC, the more likely Gaia will have a driver for it; you may run into Gaia driver availability issues with the newest, shiniest NICs.
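If you want to see what Multi-Queue actually looks like on a given box, counting the queue directories each interface exposes in sysfs is a quick check. A small sketch (generic Linux sysfs, Python 3, not a Check Point tool; Gaia layers its own Multi-Queue tooling on top of this):

```python
#!/usr/bin/env python3
# Count the RX/TX queue directories each interface exposes under
# /sys/class/net/<iface>/queues. Single-queue drivers such as e1000
# show one rx-* and one tx-* entry; multi-queue drivers show several.
import os

SYS_NET = "/sys/class/net"

for iface in sorted(os.listdir(SYS_NET)):
    qdir = os.path.join(SYS_NET, iface, "queues")
    if not os.path.isdir(qdir):
        continue
    entries = os.listdir(qdir)
    rx = sum(1 for e in entries if e.startswith("rx-"))
    tx = sum(1 for e in entries if e.startswith("tx-"))
    print(f"{iface}: {rx} RX queue(s), {tx} TX queue(s)")
```

Under a hypervisor the counts reflect the vNIC type and vCPU allocation as much as the driver, so a vmxnet3 interface in a small VM may still show only a couple of queues.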
Thank you for the great information. Bob wins the accepted solution on detail.
You can mark multiple replies as solutions. 😉
Separately, I don't think I was aware that e1000 cards don't do Multi-Queue. I use and recommend them because they're widely available and basically free in low-profile quad-port form. I have a box of about 15 such cards. Not sure about specific network interface models to look for if you need to try out Multi-Queue.
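For what it's worth, one way to check a candidate card without digging through spec sheets is `ethtool -l`, which reports the pre-set maximum channel (queue) counts; single-queue drivers like e1000 typically just answer "Operation not supported". A rough wrapper, assuming ethtool is installed and Python 3 is available:

```python
#!/usr/bin/env python3
# Query each physical interface's channel (queue) limits with `ethtool -l`.
# Cards whose driver has no multi-queue support (e.g. e1000) usually
# return "Operation not supported" here.
import os
import subprocess

SYS_NET = "/sys/class/net"

for iface in sorted(os.listdir(SYS_NET)):
    # only PCI-backed interfaces are interesting
    if not os.path.islink(os.path.join(SYS_NET, iface, "device", "driver")):
        continue
    result = subprocess.run(["ethtool", "-l", iface],
                            capture_output=True, text=True)
    if result.returncode != 0:
        print(f"{iface}: no channel info ({result.stderr.strip()})")
    else:
        print(f"{iface}:\n{result.stdout.strip()}")
```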
