
Build a Real-Time Streaming Platform - Easier and Faster

Because performance matters when dealing with streaming data, in-memory processing and in-memory data are key. Keeping data in motion is essential for application performance. Organizations building real-time stream processing systems need an in-memory paradigm, the ability to use any message broker, and a platform they can trust to deliver each message exactly once.

Today’s innovative businesses process ultra-fast data streams at their core; that processing must be fault tolerant and continuously adapt to changing data flows. The ability to work with exactly-once streams simplifies development, and it becomes even more important as real-time streaming spreads.
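In a broker such as Apache Kafka, exactly-once delivery is typically requested through producer configuration. As a minimal sketch (the property names are Kafka's own, but the broker address, transactional id, and the surrounding class are illustrative assumptions, not material from the talk):

```java
import java.util.Properties;

public class ExactlyOnceConfig {
    // Builds a producer configuration that asks Kafka for idempotent,
    // transactional (exactly-once) delivery semantics.
    static Properties exactlyOnceProducerProps() {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // assumed broker address
        props.put("enable.idempotence", "true"); // de-duplicates retried sends
        props.put("acks", "all");                // required when idempotence is on
        props.put("transactional.id", "orders-tx-1"); // hypothetical id; enables transactions
        return props;
    }

    public static void main(String[] args) {
        Properties p = exactlyOnceProducerProps();
        System.out.println(p.getProperty("enable.idempotence"));
    }
}
```

With these settings, a producer can wrap sends in begin/commit transaction calls so that consumers reading with `isolation.level=read_committed` see each message effect exactly once.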

Join Colin MacNaughton, Head of Engineering at Neeve Research, to learn how to build, monitor, and manage ultra-high-performance, fault-tolerant, stream-based systems faster and with extreme ease.

Chadev will be hosting the Chattanooga Developer Lunch on Thursday, October 19, from 12:00 PM to 1:00 PM.

About the Presenters:

Colin MacNaughton, Head of Engineering at Neeve Research, specializes in high performance In-Memory Computing and enterprise middleware.


Leveraging Messaging Platforms such as Kafka for Real-time Streaming Transactions

Unfortunately, most of us suffer from the complexity of the architectures required for real-time data processing at scale. Many technologies need to be stitched together, and each technology is often complex by itself. Usually, we end up with a strong discrepancy between how we, as engineers, would like to work vs. how we end up working in practice.

In this session, we will talk about how to radically simplify the architecture, shorten development time, and reduce application latency. We will cover how you can build applications to serve real-time processing needs without having to spend months building infrastructure, while still benefiting from properties such as guaranteed delivery, high scalability, distributed computing, and fault tolerance. We will discuss use cases where stream processing requires transactional, database-like functionality. Kafka (or any messaging broker) lets you bridge the worlds of streams and databases when implementing core business applications (inventory management for large retailers, IoT-based patient sensor monitoring in healthcare, fleet tracking in logistics, etc.), for example in the form of event-driven, containerized microservices.
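The stream/database bridge described above can be illustrated with a small, self-contained sketch (plain Java, not the X Platform or Kafka APIs): a stream of events transactionally updates a table-like state view, the pattern behind use cases such as inventory management. All names below are hypothetical.

```java
import java.util.HashMap;
import java.util.List;
import java.util.Map;

public class StreamTableSketch {
    // An inventory "table" materialized from a stream of
    // (sku, quantityDelta) events -- the stream/table duality that
    // brokers like Kafka expose via changelog topics.
    static Map<String, Integer> materialize(List<Map.Entry<String, Integer>> events) {
        Map<String, Integer> inventory = new HashMap<>();
        for (Map.Entry<String, Integer> event : events) {
            // merge() folds each event into the current state for that key
            inventory.merge(event.getKey(), event.getValue(), Integer::sum);
        }
        return inventory;
    }

    public static void main(String[] args) {
        List<Map.Entry<String, Integer>> events = List.of(
            Map.entry("widget", 10),  // stock received
            Map.entry("widget", -3),  // order shipped
            Map.entry("gadget", 5));
        System.out.println(materialize(events)); // current inventory view
    }
}
```

In a real deployment the event list would be a partitioned topic and the map a replicated, fault-tolerant state store, but the fold-events-into-state shape is the same.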

Join us Thursday, October 19th, from 6:30 PM to 8:30 PM with Colin MacNaughton, Head of Engineering at Neeve Research and an author of several open source projects, including Eagle, Robin, and Lumino. Colin will share his experience, techniques, and best practices for building real-time applications and services using X Platform™, a powerful, easy-to-use library for developing highly scalable, fault-tolerant, distributed stream processing applications on top of Apache Kafka or other messaging brokers.

About the Presenters:

Colin MacNaughton, Head of Engineering at Neeve Research, specializes in high performance In-Memory Computing and enterprise middleware.


IMC Summit 2017

The In-Memory Computing Summit Silicon Valley takes place October 24-25, 2017. An industry-wide event, it covers the full range of in-memory computing technologies and solutions. Conference attendees include technical decision makers, implementers, and developers who make or influence purchasing decisions about in-memory computing, Big Data, Fast Data, IoT, and HPC solutions. Held in San Francisco, IMCS Silicon Valley is a meeting place for in-memory computing users.