
Redis Labs introduces Landmark Machine Learning Module for Redis: Redis-ML

October 04, 2016

Deliver instant predictive intelligence with the combination of Redis-ML and Spark ML

Mountain View, October 4, 2016—Today, Redis Labs, the home of Redis, introduced Redis-ML, an open source Redis Module for Machine Learning that, in combination with Spark Machine Learning (Spark ML), accelerates the delivery of real-time recommendations and predictions for interactive applications.

Machine learning is fast becoming a critical requirement for modern smart applications. Redis-ML accelerates the delivery of real-time predictive analytics for use cases such as fraud detection and risk evaluation in financial products, product or content recommendations for e-commerce applications, demand forecasting for manufacturing applications, and sentiment analysis of customer engagements. Spark ML (previously MLlib) delivers proven machine learning libraries for classification and regression tasks. Combined with Redis-ML, applications can now serve precise, reusable machine learning models faster and with lower execution latency.

“The combination of Apache Spark and Redis simplifies and accelerates the implementation of predictive intelligence in modern applications,” said Ram Sriharsha, product manager for Apache Spark at Databricks. “This latest release from Redis is a great example of Spark’s growth and maturity in enterprise machine learning applications.”

“The Redis-ML module, with Apache Spark, delivers lightning-fast classifications on larger data sizes, in real time and under heavy load, while allowing many applications developed in different languages to utilize the same models simultaneously,” said Dvir Volk, senior architect at Redis Labs. “The Redis-ML module is a great demonstration of the power of the Redis Modules API in supporting the cutting-edge needs of next-generation applications.”

Redis-ML enriches Spark ML in the following areas:

  • Faster Prediction Generation: Storing and serving trained Spark Machine Learning models directly from Redis parallelizes access to the models and significantly improves performance. Initial benchmarks showed a 5x to 10x latency improvement over the standard Spark solution for real-time classifications. Redis-ML avoids the need to load the model from file systems or other disk-based data stores, a process that usually involves long serialization/deserialization overheads and slow disk accesses. With Redis-ML, at the end of the training phase, the model is simply stored in its native format in Redis.
  • Consistent Prediction Delivery: As user traffic grows, it is important to guarantee real-time recommendations and predictions at a consistent speed to the end user. With Redis-ML, recommendations and predictions are delivered at the same speed no matter how many concurrent users are accessing the model.
  • Greater Interoperability: Redis-ML provides interoperability across languages, including Scala, Node.js, .NET, Python and more. With Redis-ML, your models are no longer restricted to the language they were developed in; they can be accessed concurrently by applications written in different languages using the simple API.
  • Scaling Machine Learning Models: Delivering predictions with better precision requires larger machine learning models. Existing solutions cannot hold the model in memory once it grows beyond the memory available on a single node, forcing serialization/deserialization to disk and degrading performance. The Redis-ML module takes full advantage of Redis’ in-memory distributed architecture to scale the database to any size needed, in a fully automated manner, without affecting performance.
  • Simplified Deployment: Once the models are ready, Redis-ML makes it easy to obtain recommendations or predictions for the application using simple APIs (see the sketch after this list), without having to implement custom recommendation/prediction generation code or set up a highly available and scalable infrastructure to support it.
  • Higher Availability: Training new models can be done offline. However, reliably delivering real-time predictive intelligence is critical for modern applications. The Redis-ML module, deployed with Redis Labs’ technology, delivers always-on availability that protects against process, node, rack or data center failures with instant automatic detection and failover.
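
To give a concrete sense of the API, here is a minimal sketch of serving a prediction from a client application. It assumes a decision-forest model has already been loaded into Redis under the key "myforest" (for example, from a model trained with Spark ML); the ML.FOREST.RUN command follows the Redis-ML module's interface, while the key name, feature names and feature values are illustrative assumptions rather than a definitive implementation.

    # Minimal sketch: query a decision-forest model served by the Redis-ML module.
    # Assumes the module is loaded and a forest is stored under "myforest";
    # the key name and feature values below are illustrative only.
    import redis

    r = redis.Redis(host="localhost", port=6379)

    # Classify a single observation against the stored forest. Features are
    # passed as a comma-separated list of "attribute:value" pairs.
    prediction = r.execute_command(
        "ML.FOREST.RUN", "myforest", "age:35,income:72000,tenure:4"
    )
    print(prediction)

Because the model lives in Redis rather than inside a single Spark process, the same call can be issued concurrently from clients in any language that has a Redis driver.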

Join Redis Labs at Big Data London on Nov. 4, 2016 to hear more about the Redis Module for Machine Learning (Redis-ML) with Spark Machine Learning (Spark ML).

The open source Redis Module for Machine Learning (Redis-ML) is available at https://github.com/RedisLabs/spark-redis-ml.

Create additional modules to solve modern data challenges at the Redis Module Global Hackathon; registration is now open and submissions close on Nov. 12th. The event is expected to bring together over 500 teams from around the world, online and at the associated onsite hackathons in San Francisco and Tel Aviv. Participants will be eligible to win up to $10,000 in total cash prizes. Grand prize winners will be announced on Nov. 17th.

About Redis

Data is the lifeline of every business, and Redis helps organizations reimagine how fast they can process, analyze, make predictions, and take action on the data they generate. Redis provides a competitive edge to any business by delivering open source and enterprise-grade data platforms to power applications that drive real-time experiences at any scale. Developers rely on Redis to build performance, scalability, reliability, and security into their applications.

Born in the cloud-native era, Redis uniquely enables users to unify data across multi-cloud, hybrid and global applications to maximize business potential. Learn how Redis can give you this edge at redis.com.
