Linker Finance is a fintech company helping community banks grow in the digital age. We build core-agnostic technology that enables banks to launch modern digital banking experiences, streamline retail and business onboarding, and grow deposits without replacing their core systems. Our platform supports online deposit account opening, business onboarding, digital banking, and relationship-building tools for both retail and commercial banking. By combining modern user experiences with the operational, compliance, and integration realities of community banks, Linker Finance helps financial institutions move faster, serve their customers better, and compete more effectively in an increasingly digital market. We are building technology for a critical part of the financial ecosystem: the community banks that support local businesses, families, and communities across the country.
The Data Engineering Intern will support the development of Linker Finance's data platform, focusing on ingesting, structuring, and operationalizing data across customer, platform, and document-based workflows. The role centers on building scalable data pipelines and foundational data models that power analytics, reporting, platform visibility, and future AI-enabled use cases. This internship blends data engineering fundamentals with real-world fintech needs, including document processing, customer metrics, operational reporting, and data quality. The intern will work closely with leadership and engineering to help establish the data backbone needed to support internal decision-making, customer-facing insights, and long-term platform scalability.
Responsibilities
• Support ingestion of structured and unstructured data into the platform, including customer, transaction, operational, and document-based data.
• Assist in building and maintaining pipelines that move data from source systems into storage and analytics layers.
• Help transform raw data into clean, structured, and queryable formats that can support reporting and downstream applications.
• Contribute to improving data quality, consistency, traceability, and repeatability across pipelines and environments.
• Support extraction of structured information from PDF and document-based inputs such as onboarding records, business documents, and financial forms.
• Assist in integrating document-processing tools such as AWS Textract or similar services to improve automated extraction workflows.
• Help define schemas and data structures for extracted forms, tables, and entities.
• Participate in improving the usability, accuracy, and organization of extracted data for downstream analytics and operations.
• Assist in designing logical data models for customer, account, transaction, and platform activity data.
• Support development of datasets that track important business and customer metrics such as total users, active users, deposits, external account connections, and transaction volumes.
• Help organize data in ways that support customer health visibility, platform reporting, and decision-making for leadership.
• Contribute to building a strong foundation for scalable, multi-tenant data practices over time.
• Support creation of dashboards and reports that improve visibility into platform activity, customer behavior, and operational performance.
• Assist in ensuring that required data is available, reliable, and structured correctly for reporting needs.
• Help connect business questions to usable datasets and reporting outputs.
• Explore ways to improve internal data accessibility for leadership, product, engineering, and customer-facing teams.
• Assist in defining data validation rules, quality checks, and lightweight governance practices.
• Support documentation of data flows, transformations, schemas, and dependencies.
• Help ensure that expected data outputs align with actual platform behavior and reporting needs.
• Contribute to building repeatable, well-documented patterns for future data engineering work.
Qualifications
• Currently enrolled in a college or university program in Computer Science, Data Engineering, Data Science, Information Systems, or a similar field.
• Experience with Python, SQL, or similar languages.
• Basic understanding of data pipelines, ETL concepts, and structured data modeling.
• Familiarity with spreadsheets, data analysis, or dashboarding tools is a plus.
• Exposure to cloud platforms such as AWS is a plus.
• Strong analytical thinking, attention to detail, and interest in solving real-world data problems.
• Experience working on team-based technical or academic projects.
How to Apply
If you’re interested in joining Linker Finance, send your resume to: careers@linkerfinance.com
To help us review your application, please include: