ACCELERATED BIG DATA ANALYTICS
PRE-INSTALLED, OPTIMIZED AND TESTED
NEXT-GENERATION IN-MEMORY ANALYTICS APPLIANCE
Make Smarter Decisions Faster.
NumaQ is in-memory analytics taken to the extreme. You can load and run your 32TB of HOT data in-memory, in near real-time. NumaQ enables companies to make decisions in seconds and minutes, instead of hours or days. Built on a proven massively parallel, in-memory architecture, NumaQ runs open source analytics faster than legacy disk-based systems.
Terabytes of memory, thousands of cores, ONE server to manage.
Accelerate your analytics by running them in-memory.
10x the price-performance of legacy disk-based analytics systems.
OPEN SOURCE ANALYTICS. ACCELERATED INSIGHTS.
NumaQ comes pre-installed with a suite of pre-tested, pre-integrated open source analytics software. Available immediately are the NumaQ Spark and R appliances, where NumaQ's unique scalable in-memory architecture is matched to Spark's in-memory analytics engine and to R's appetite for memory. To further promote the use of open source analytics, the NumaQ team is working closely with the community on tighter integration of Spark and R. See [github url to go here]
SCALE OUT TO SCALE UP
NumaQ easily scales to thousands of cores and terabytes of memory for data- and memory-intensive workloads. NumaQ's unique scale-out to scale-up architecture lets customers add memory, cores and storage by simply adding more NumaQ nodes to an existing system. Customers can minimize risk, stay lean, and grow their analytics infrastructure to match current workloads, scaling only when required.
ZERO SETUP. FULLY SUPPORTED.
Start computing as soon as your NumaQ appliance is powered up. Save the weeks it would otherwise take to download, set up, test and tune your analytics software, servers and networking. NumaQ is out-of-the-box analytics. NumaQ and our network of partners support the appliance end to end, from the server to the analytics software stack: your single source of support and expertise.