Caylent Services
Data Modernization & Analytics
From implementing data lakes and migrating off commercial databases to optimizing data flows between systems, turn your data into insights with AWS cloud native data services.
Lincoln Savings Bank (LSB) opened for business in 1902. Today, it is a full-service bank offering deposit accounts, loans, and other traditional banking services. LSB has 16 branches and its assets total about $1.8 billion.
LSB operates a subsidiary called LSB Financial Services, which handles customers' investment, trust, and estate planning needs. LSB also runs a Banking as a Service (BaaS) division, LSBX, which merges traditional payment systems with new and emerging technologies and provides a wide range of services to the rapidly growing financial technology industry.
LSB aims to leverage data to expand its customer base, grow its deposit base and loan volume, increase fee income, lower customer acquisition costs, and reduce fraud.
Aligned with these strategic goals, LSB was looking to extend and enhance visibility into their business data and capitalize on various data sources in new and exciting ways. Historically, LSB's infrastructure has been fully on-premises, and LSB relied on external SaaS software for data management and reporting. This limited their access to comprehensive reporting customized to their unique needs: producing such reports was difficult, cost-prohibitive, and time-consuming.
To overcome these challenges, LSB engaged Caylent to gather data from their on-premises and SFTP data sources into a single cloud data warehouse, easing access to data, consolidating reporting, and enabling business intelligence (BI).
Nick Suender
CIO
With a data warehouse, Lincoln Savings Bank now has a secure, compliant, and governed foundation to consolidate and manage their data and enable capabilities like reporting, BI, and analytics. Their operational and executive teams benefit from significantly greater visibility into their data and can customize reports to their specific needs, allowing them to back strategic decisions with data.
Generating custom reports with their prior infrastructure was either too difficult or required expensive vendor requests that could take a long time to fulfill, reducing the applicability and usefulness of their data. Now, LSB can generate reports within 1-2 days and update them frequently to maximize data veracity. Data aggregation, access, and visualization are secure and compliant, and their cloud infrastructure offers greater scalability and disaster recovery (DR) capabilities to ensure smooth and reliable operations.
Greater access to data and a data governance culture have broken down silos within the organization, improving the quality of communication. LSB now has the ability to drive decisions from a data lens, and they can generate key insights to better understand their customer lifecycles, increase value, improve customer-product fit, and detect fraud.
The initial requirements for the data warehouse deployment were to establish a secure, governed foundation and integrate the first four of many data sources. The Caylent team gathered the requirements for governance, transformation, and data consumption, and architected a solution including the infrastructure and ETL scripts necessary to populate this new data warehouse, along with initial reports and required dashboards.
The ingestion method into the S3 bucket varies by data source. AWS Lambda functions run on a schedule to pull data from SFTP servers, while AWS Database Migration Service (DMS) replicates data from on-premises databases. Only services and scripts with the appropriate IAM policies can write to the S3 buckets. Amazon SNS is used for notifications.
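As an illustration, the scheduled SFTP pull can be sketched as a Lambda handler that writes each file under a date-partitioned S3 prefix. The bucket layout, prefix, and event-field names here are hypothetical, and the actual SFTP download and S3 upload calls are elided:

```python
from datetime import datetime, timezone

def landing_key(source: str, filename: str, when: datetime) -> str:
    """Build a date-partitioned S3 key for a file pulled from an SFTP
    source, so downstream readers can prune by year/month/day."""
    return (
        f"landing/sftp/{source}/"
        f"year={when:%Y}/month={when:%m}/day={when:%d}/{filename}"
    )

def lambda_handler(event, context):
    """Entry point invoked on a schedule (e.g. EventBridge). The SFTP
    download and the boto3 put_object upload are omitted; only the key
    layout and the handler shape are shown."""
    now = datetime.now(timezone.utc)
    keys = [landing_key(event["source"], f, now) for f in event["files"]]
    return {"uploaded": keys}
```

In a real deployment the handler would fetch the files over SFTP and upload them with boto3, with IAM restricting writes to the landing bucket as described above.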
The first step of the ingestion process is moving data into the landing S3 bucket. The data is stored there only temporarily and is deleted after tokenization with Protegrity; the tokenized data is stored in a second S3 bucket. Data in flight during ingestion is encrypted. AWS Lake Formation (LF) controls access to the buckets, and the Glue Data Catalog is used to add LF tags to the data catalog. Finally, the LF tags are propagated to Snowflake.
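The landing-bucket lifecycle (tokenize, persist the tokenized copy, delete the raw object) can be sketched with in-memory dicts standing in for the two buckets. Protegrity's API is proprietary, so a deterministic hash stands in for the tokenizer here:

```python
import hashlib

def tokenize(value: str) -> str:
    """Stand-in for the Protegrity tokenizer: deterministic, and not
    reversible without a (hypothetical) token vault."""
    return "tok_" + hashlib.sha256(value.encode()).hexdigest()[:12]

def process_landing_object(landing: dict, tokenized: dict, key: str) -> None:
    """Tokenize PII in a landed record, persist it to the tokenized
    store, then delete the raw copy so clear-text PII does not linger
    in the landing bucket."""
    record = landing[key]
    record["customer_id"] = tokenize(record["customer_id"])
    tokenized[key] = record
    del landing[key]
```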
The Snowflake data warehouse has three layers: raw, curated, and consumption. The raw layer uses external tables, meaning the data remains physically stored on S3 and Snowflake reads it in place. The other two layers use tables stored natively in Snowflake.
dbt, a transformation tool, handles the transformation logic, orchestration, change management, tests, and data quality assurance. Additionally, Caylent deployed a right-to-be-forgotten process, enabling LSB to delete personal data from the warehouse layers accessed by business analysts while keeping the raw data for compliance purposes.
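The right-to-be-forgotten split can be illustrated with a small function: an erasure request removes the customer's rows from the curated and consumption layers only, while the (tokenized) raw layer is retained for compliance. The layer and field names are hypothetical:

```python
ERASABLE_LAYERS = {"curated", "consumption"}  # raw is kept for compliance

def forget_customer(layers: dict, customer_token: str) -> dict:
    """Return a copy of the warehouse layers with the customer's rows
    removed from every analyst-facing layer, leaving raw untouched."""
    return {
        name: rows if name not in ERASABLE_LAYERS
        else [r for r in rows if r["customer"] != customer_token]
        for name, rows in layers.items()
    }
```

In the actual warehouse this logic would live in dbt models and SQL rather than Python; the point is only that erasure stops short of the raw layer.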
Finally, the architecture integrates with Power BI for data visualization and analysis, and with Amazon SageMaker Canvas to build models and enable AI.
The Caylent team assessed different providers and recommended Atlan for data cataloging, data lineage, and tagging (for PII, SPI, etc.). Protegrity is used to tokenize data and de-identify customer IDs and other PII. Both tools are available in AWS Marketplace.
A data retention process was defined using S3 lifecycle rules, data quality dimensions were applied with dbt, and data security features were enabled, including enforced encryption in transit over SSL, role-based access control, and server-side encryption of data at rest using SSE-KMS.
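The S3 side of the retention process can be expressed as a lifecycle configuration in the shape boto3's put_bucket_lifecycle_configuration expects. The prefixes and retention periods below are illustrative assumptions, not the engagement's actual values:

```python
def retention_rules(landing_days: int = 7, archive_days: int = 90) -> dict:
    """Build an S3 lifecycle configuration: expire landing objects
    quickly (raw PII is transient) and archive tokenized data to
    Glacier. Prefix names and day counts are hypothetical."""
    return {
        "Rules": [
            {
                "ID": "expire-landing",
                "Filter": {"Prefix": "landing/"},
                "Status": "Enabled",
                "Expiration": {"Days": landing_days},
            },
            {
                "ID": "archive-tokenized",
                "Filter": {"Prefix": "tokenized/"},
                "Status": "Enabled",
                "Transitions": [
                    {"Days": archive_days, "StorageClass": "GLACIER"}
                ],
            },
        ]
    }
```

The resulting dict would be passed as the LifecycleConfiguration argument to the boto3 S3 client's put_bucket_lifecycle_configuration call.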
In Snowflake, data is de-tokenized in SQL at runtime, depending on the user's role and the governing Protegrity security policy. Snowflake automatically rotates master keys once they are 30 days old. A PrivateLink connection is established between AWS and Snowflake.
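The role-gated de-tokenization can be illustrated as follows; in the real deployment it is performed by a Protegrity integration invoked from Snowflake SQL (e.g. conditioned on CURRENT_ROLE()), and the vault contents and role names below are stand-ins:

```python
TOKEN_VAULT = {"tok_3f2a": "12345"}    # token -> clear value (stand-in)
PRIVILEGED_ROLES = {"FRAUD_ANALYST"}   # hypothetical roles cleared for PII

def render_customer_id(token: str, role: str) -> str:
    """De-tokenize only for privileged roles; every other role sees
    just the token, never the underlying customer ID."""
    if role in PRIVILEGED_ROLES:
        return TOKEN_VAULT.get(token, token)
    return token
```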
Finally, the Caylent team implemented an approach to comply with the LSB’s customers’ “right to be forgotten,” in addition to keeping protected raw data for compliance and audit purposes.