
Introduction

Navigating data highways with efficient decoding
Grasping petabytes of intricate data for informed decision-making
Our client is a vanguard in vehicle testing within the automotive industry. Each test drive not only propelled their cars forward but also amassed petabytes of intricate data: nanosecond-precision records that capture the car’s every move, from the tires’ rotation to the windows opening and closing. The roadblock? Decoding this data trove quickly and affordably.

The challenge

The road bumps of automotive data
Decoding specific data entities within 24 hours for post-test insights
Every vehicle test conducted by our client generated several terabytes of data daily, accumulating to petabytes across multiple vehicles. Captured through Electronic Control Unit (ECU) communications, this data offered intricate insights across thousands of data points, such as the vehicle’s speed and movement. The sheer volume of this information called for an efficient decoding method.
Decoding this enormous amount of data was costly and time-consuming. The client needed to retrieve specific data entities within 24 hours of each test run, but struggled to achieve that.

The high toll of processing costs & delays
The data collected by the client is stored in a standardized industry format called Measurement Data Format (MDF). This encoding format is designed for efficient and secure binary data storage. Decoding such a complex data structure, especially with the high precision required for the various signals, posed a significant challenge for the client.
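To give a feel for what decoding binary measurement data involves, here is a minimal Python sketch using fixed-width records. The record layout below (timestamp, channel ID, value) is purely illustrative; real MDF files describe each record’s structure in header blocks rather than using a single hard-coded layout.

```python
import struct

# Hypothetical fixed-width record layout, for illustration only --
# real MDF files carry channel metadata in header blocks that define
# each record's actual structure.
RECORD = struct.Struct("<QHd")  # uint64 timestamp (ns), uint16 channel id, float64 value

def decode_records(buf: bytes):
    """Yield (timestamp_ns, channel_id, value) tuples from a binary buffer."""
    for offset in range(0, len(buf), RECORD.size):
        yield RECORD.unpack_from(buf, offset)

# Encode two sample records, then decode them back.
raw = RECORD.pack(1_000_000_001, 7, 88.5) + RECORD.pack(1_000_000_002, 7, 88.7)
decoded = list(decode_records(raw))
```

The nanosecond-precision timestamps mentioned above are why a 64-bit integer field is assumed here; a 32-bit field would overflow within seconds.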
The company incurred an operational cost of 1,000 euros per TB of data processed. Additionally, delays in data processing lengthened the time to detect anomalies and failures, which in turn increased operational expenses.

Solution

Shifting modes to scalable and cost-effective decoding
To address these challenges, Fractal took a tailored approach that harnessed the strengths of modern technologies to create DataLogger, a cost-effective, high-performance decoding solution implemented across various domains within the organization.
Our data engineering expertise and ability to quickly grasp domain-specific nuances resulted in an MVP that exceeded expectations. The solution demonstrated the power of a modernized, scalable architecture built on the client-preferred Microsoft tech stack.

THE TRANSFORMATIVE SOLUTION
Fractal’s DataLogger decodes physical streams for actionable insights
By decoding signal data into physical streams, Fractal’s DataLogger offers engineers insights into vehicle health, driver behavior, necessary pre-production corrections, and preemptive measures for electric vehicles, such as battery health assessments. This is achieved through a logical, nine-step process:
1. Deep Dive into Communication Protocols & Signal Recording
Input: Raw data (ECU communications, Binary Data formats, protocols, reference AutoSAR XML files)
Process: Thorough analysis of core communication protocols

2. Data Ingestion & Initial Processing
Input: Terabytes of vehicle test data
Process: Preliminary dry runs & preparation for extensive I/O operations
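Preparing for extensive I/O typically means streaming large files in bounded chunks rather than loading them whole. The sketch below illustrates that pattern; the chunk size is an assumed tuning value, not a figure from the project.

```python
import io

# Terabyte-scale measurement files cannot be loaded in one piece;
# reading in fixed-size chunks keeps memory usage flat during the
# extensive I/O this step prepares for.
CHUNK_SIZE = 64 * 1024  # 64 KiB per read -- an assumed value, tune to the storage backend

def iter_chunks(stream, chunk_size=CHUNK_SIZE):
    """Yield successive byte chunks from a binary stream until exhausted."""
    while True:
        chunk = stream.read(chunk_size)
        if not chunk:
            break
        yield chunk

# Stand-in for a test-drive file: two full chunks plus a 10-byte tail.
data = io.BytesIO(b"\x00" * (CHUNK_SIZE * 2 + 10))
sizes = [len(c) for c in iter_chunks(data)]
```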

3. Big Data Integration and Auto-Scaling
Input: Varying incoming data
Process: Integration with Big Data tools; auto-scaling based on data load
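A toy version of the auto-scaling rule this step describes is shown below: worker count follows the data backlog. The per-worker throughput and worker cap are assumed figures for illustration, not measurements from the engagement.

```python
import math

# Assumed per-worker throughput -- an illustrative figure only.
GB_PER_WORKER_PER_HOUR = 250

def workers_needed(backlog_gb: float, max_workers: int = 50) -> int:
    """Scale worker count with the backlog, bounded between 1 and max_workers."""
    return min(max_workers, max(1, math.ceil(backlog_gb / GB_PER_WORKER_PER_HOUR)))
```

In the actual solution this kind of decision is delegated to the platform’s auto-scaler rather than hand-rolled, but the principle is the same: capacity tracks incoming data load.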


4. MVP Development & Signal Conversion
Input: Test data from CAN Bus, several measurement data files.
Process: Initial conversion of signals from CAN Bus
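Converting a CAN Bus signal to a physical value generally means extracting a bit field from the 8-byte frame and applying a linear mapping (physical = raw × factor + offset). The signal definition below, in the style of a DBC database entry, is a hypothetical example, not one of the client’s actual signals.

```python
# A CAN frame carries up to 8 data bytes; each signal occupies a bit
# range and maps to a physical value via: physical = raw * factor + offset.
# The start bit, length, factor, and offset below are hypothetical.

def extract_signal(data: bytes, start_bit: int, length: int,
                   factor: float, offset: float) -> float:
    """Decode a little-endian (Intel byte order) unsigned signal."""
    raw_frame = int.from_bytes(data, "little")
    raw = (raw_frame >> start_bit) & ((1 << length) - 1)
    return raw * factor + offset

# Example: a speed signal as a 16-bit field at bit 8, 0.01 km/h per bit.
frame = bytes([0x00, 0x10, 0x27, 0x00, 0x00, 0x00, 0x00, 0x00])
speed = extract_signal(frame, start_bit=8, length=16, factor=0.01, offset=0.0)
```

Here the raw value 0x2710 (10000) scales to 100 km/h. Signed signals and Motorola byte order need extra handling that this sketch omits.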

5. Event-Driven Processing and Cloud Agnosticism
Input: Measurement data files from test drives.
Process: Use of Kubernetes’ event-driven architecture for efficient scaling and parallel processing.

6. Multithreaded Data Conversion & Writing
Input: Continuous data stream.
Process: One thread reads and converts, and another writes the output to Delta Lake storage.
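The two-thread pattern of this step can be sketched with a bounded queue between a reader/converter and a writer. The scale factor is a placeholder conversion and the in-memory list stands in for the Delta Lake sink; both are assumptions for illustration.

```python
import queue
import threading

def run_pipeline(raw_records):
    """Reader thread converts records; writer thread drains them to a sink."""
    converted = queue.Queue(maxsize=1024)  # bounded, so reading can't outrun writing
    sink = []                              # stand-in for Delta Lake storage
    DONE = object()                        # sentinel marking end of stream

    def reader():
        for rec in raw_records:
            converted.put(rec * 0.01)      # placeholder conversion (scale factor)
        converted.put(DONE)

    def writer():
        while True:
            item = converted.get()
            if item is DONE:
                break
            sink.append(item)

    threads = [threading.Thread(target=reader), threading.Thread(target=writer)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return sink

out = run_pipeline([100, 200, 300])
```

Overlapping conversion with output I/O this way hides write latency behind compute, which is the point of splitting the work across two threads.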

7. Core Logic Enhancement and Optimization
Input: Data ready for conversion.
Process: Transition of core conversion logic to C++ for optimization.

8. Operational Metadata Logging and Flexible Orchestration
Input: Runtime data operations.
Process: Use of serverless function APIs for logging; adaptable architecture for orchestration.
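Operational metadata logging of this kind boils down to emitting a structured record per decoding run. The field names and file name below are a hypothetical schema, not the client’s actual payload; in the described architecture a serverless function API would receive such a record.

```python
import json
import time

# Hypothetical metadata record for one decoding run -- field names are
# illustrative only. A serverless logging endpoint would accept this JSON.

def build_run_metadata(source_file: str, records_decoded: int,
                       duration_s: float, status: str = "succeeded") -> dict:
    """Assemble a structured log record describing one decoding run."""
    return {
        "source_file": source_file,
        "records_decoded": records_decoded,
        "duration_seconds": round(duration_s, 3),
        "status": status,
        "logged_at": int(time.time()),  # Unix timestamp of the log event
    }

payload = json.dumps(build_run_metadata("drive_2021_04_01.mf4", 1_250_000, 42.7))
```

Keeping such records queryable is what lets the orchestration layer spot slow or failed runs without inspecting the data itself.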

9.Output delivery
Output: Decoded data in human-readable format, ready for business integration.

Outcome

From 0 to 100 with ‘the In-house Converter’
A leap in cost efficiency from 1,000 euros to 25 euros per TB
Performance Unleashed: With Fractal’s solution, 1 TB of data was processed in less than an hour, a task that previously took an entire day.
Cost Milestone: A massive leap in cost-efficiency was achieved, reducing the expense from 1,000 euros per TB to 25 euros per TB.
In-House Mastery: Freed from external licensing costs, the client proudly named their new tool the In-House Converter (IHC), now celebrated within IT and business teams alike.