Many organisations grapple with existing systems that struggle to meet the demands of today’s data-driven world. Our client, a large international investment bank, was no exception. Projective Group has delivered several projects for them using Snowflake to make their data infrastructure more modern, cost-effective, and faster.
Change is the only constant – that’s true in life and in business. Many organisations struggle to keep up with changes, or rather, their infrastructure does. Whether it’s systems that fail under load, systems that are expensive to run or expensive to change, or all of the above: no organisation is immune. Steve Jenkings, a seasoned data professional at Projective Group, has successfully completed several Data Engineering projects for this client and shares some of his success stories with us.
Proof of concept
Steve and his team take a methodical approach to each project, typically starting with an analysis of the existing capability before proposing architectural options for more robust modern alternatives. They agree success criteria by which the team can judge their efforts (e.g. report execution performance, concurrent query execution capability) and work to deliver an end-to-end prototype as quickly as possible. Once a successful proof of concept is delivered, the client frequently asks the team to move to deliver the complete project. “There are always existing systems with problems that can be replaced with a modern, faster, and more cost-effective Snowflake-based solution,” says Steve.
Not just copy and paste
Many of the projects we deliver are migrations from legacy technology to new technology and, contrary to popular belief, these migrations involve much more than just copying what’s there and pasting it somewhere else. “First you have to understand what’s there. You have to break it down, work out which pieces are or are not required in the new solution, then determine what each one does and should look like once migrated. During a migration we have a lot of opportunities to fix past problems, to change the data model and rewrite code to remove and simplify the logic, and then to optimise it both in general and specifically for Snowflake. We never copy and paste code from one system to another, we re-engineer it to make the best use of the target platform we’re deploying on,” Steve explains.
During a migration project, we never just copy and paste code from one system to another. We re-engineer everything to make the best use of the target platform.
Cheaper, faster, and more efficient
The results of this meticulous approach are tangible, and the benefits go beyond mere cost savings. As more operational data is moved into Snowflake, and with the right attention to detail, ‘data products’ can be created, laying the foundations for a possible ‘data mesh’. Ultimately, other use cases across the organisation can leverage this data and these processes rather than starting from scratch, eliminating redundancy and improving productivity.
One of the most tangible results has been a reduction in the time taken to produce reports. “The risk business are now seeing their control reports earlier in the day than they’d ever seen in their history,” recalls Steve. “The product control team also saw a dramatic reduction in the time it took to re-run a report after making adjustments. This used to take them 45 minutes, but since we migrated to Snowflake it now completes in less than 5 minutes”.
Re-running a report after making adjustments used to take the product control team 45 minutes. Since migrating to Snowflake, that time is now less than 5 minutes.
Steve and his team have also developed valuable intellectual property to facilitate customer delivery. They have built a data transformation framework specifically for the client’s context, which is used in their production data pipelines that run against Snowflake. They have also developed a tool called Gondola that compares database structures across different environments to facilitate CI/CD and DataOps operations. These tools help streamline and improve the overall data flow within the organisation.
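The article doesn’t describe how Gondola works internally, but the core idea of comparing database structures across environments can be sketched in a few lines. The sketch below is purely illustrative – every function, table, and column name is hypothetical – and simply diffs table/column metadata between two environments, the kind of check that supports CI/CD promotion of schema changes.

```python
# Hypothetical sketch of cross-environment schema comparison, in the spirit
# of a tool like Gondola (its actual implementation is not described here).

def diff_schemas(dev: dict, prod: dict) -> dict:
    """Compare {table: {column: type}} mappings from two environments."""
    report = {"missing_tables": [], "extra_tables": [], "column_drift": {}}
    for table, cols in dev.items():
        if table not in prod:
            report["missing_tables"].append(table)  # in dev but not in prod
            continue
        drift = {col: (dev_type, prod[table].get(col))
                 for col, dev_type in cols.items()
                 if prod[table].get(col) != dev_type}
        if drift:
            report["column_drift"][table] = drift   # type or column mismatch
    report["extra_tables"] = [t for t in prod if t not in dev]
    return report

# Illustrative metadata (in practice this would come from the database's
# information schema).
dev = {"trades": {"id": "NUMBER", "ts": "TIMESTAMP_NTZ"}}
prod = {"trades": {"id": "NUMBER", "ts": "TIMESTAMP_TZ"}, "old_audit": {}}
report = diff_schemas(dev, prod)
```

In a real pipeline, the two dictionaries would be populated by querying each environment’s information schema, and any non-empty drift report would fail the deployment check.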
The benefits of these data projects go beyond technical improvements and cost savings. Steve emphasises the business aspect, where the move to Snowflake has delivered real value. “Snowflake is now consuming both overnight batch data as well as real-time streaming data throughout the day. As soon as something changes in the source system, our tool detects it, takes a copy of it, and updates it in Snowflake as well. This ‘near real-time’ data streaming enables the organisation to make more informed and timely decisions.”
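The source doesn’t name the change-detection tool or its mechanics, but the pattern Steve describes – detect a change in the source system, take a copy, apply it to Snowflake – is a classic watermark-based change-data-capture loop. The sketch below is a minimal, hypothetical illustration of that pattern; all names are invented, and the upsert stands in for what would typically be a `MERGE` statement against the target.

```python
# Minimal sketch of the watermark-based change-capture pattern described
# above. All function, field, and table names are hypothetical.

def sync_changes(source_rows, target, last_watermark):
    """Upsert rows changed after `last_watermark` into `target`, keyed by id.

    Returns the new watermark so the next polling cycle only picks up
    rows that changed after this run.
    """
    new_watermark = last_watermark
    for row in source_rows:
        if row["updated_at"] > last_watermark:
            target[row["id"]] = row  # upsert, akin to a MERGE in the warehouse
            new_watermark = max(new_watermark, row["updated_at"])
    return new_watermark

# One polling cycle: only the row amended after the watermark is copied.
source = [
    {"id": 1, "updated_at": 10, "value": "trade A"},
    {"id": 2, "updated_at": 25, "value": "trade B (amended)"},
]
target = {}
wm = sync_changes(source, target, last_watermark=20)
```

Run repeatedly on a short interval, a loop like this approximates the ‘near real-time’ behaviour the article describes, without reprocessing unchanged data.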
Steve Jenkings and his team tackled the challenges of legacy infrastructure head on, adopting a systematic approach that combined strategic planning, meticulous migration, and the use of modern data tools such as Snowflake. The results speak for themselves: improved efficiency, reduced costs, faster decision making and the ability to extract maximum value from data. This success story is a testament to the power of innovation and adaptation in the ever-evolving landscape of Data Engineering.
About Projective Group
Established in 2006, Projective Group is a leading Financial Services change specialist with deep expertise across practices in Data, Payments, Transformation and Risk & Compliance.
We are recognised within the industry as a complete solutions provider, partnering with clients in Financial Services to provide resolutions that are both holistic and pragmatic. We have evolved to become a trusted partner for companies that want to thrive and prosper in an ever-changing Financial Services landscape.