Data Developer

Location:
Buffalo, NY / Remote

We believe in a world where data analytics is not a privilege but a universal right, simple enough for anyone to use to drive the profitability of their business and transform the world. With industries like accounting and manufacturing facing massive workforce disruptions (e.g., 75% of the workforce retiring in the next 10 years), we’re positioned to step in and change how data drives decisions.

HeronAI (MIT / FinTech, SaaS) empowers mid-sized firms to streamline strategic growth and month-end advisory reporting. Our proprietary algorithms help mid-sized companies connect all their disparate data in one place, analyze it, and create visual dashboards and action plans that drive strategic decision-making. Our goal is to help reduce analysis and reporting time from 4–12 weeks to under 5 minutes.

Our goal is not to be just a tool, but to reshape how data drives decisions for over 10 million people in the next 10 years.

Learn more about our Mission and Vision, and see HeronAI in the media.

By reducing reporting time from weeks to minutes, we help industries like accounting and manufacturing overcome challenges such as workforce disruptions. As a Data Developer, you’ll play a critical role in that mission: optimizing data pipelines, integrating diverse data sources, and building the foundation for scalable, secure analytics.

What You’ll Do

  • Design, implement, and maintain scalable ETL pipelines for data ingestion and transformation from tools like QuickBooks, Excel, and other financial platforms.
  • Collaborate with engineering teams to ensure seamless data flow between backend APIs, databases, and visualization tools.
  • Optimize the performance of databases (e.g., Postgres, DynamoDB), ensuring efficient handling of large-scale, complex datasets.
  • Develop and maintain data models that align with analytics and reporting needs.
  • Work closely with the DevOps team to ensure data pipelines are secure, reliable, and cost-efficient.
  • Conduct data quality checks and implement automated validation processes to ensure accuracy and consistency.
  • Troubleshoot and resolve data-related issues, ensuring minimal downtime and disruption.
  • Contribute to the company’s SOC 2 compliance efforts, ensuring data security and privacy protocols are followed.

What Makes This Role Exciting

  • As one of our first hires, you’ll have the freedom to define how we build solutions.
  • Choose the methodologies and tools that best meet our customers’ needs.
  • Work on a high-growth platform backed by MIT, Harvard, Techstars, and Forbes-recognized innovation.
  • Join us at an exciting time—we’ve grown our waitlist from 250 to 1,700 users in 10 months and recently secured $1.5M in seed funding.
  • Solve critical gaps in the data analytics market and automate workflows for industries facing major workforce transformations.

Who You Are

  • A professional with 5+ years of experience in data engineering, ETL development, or a related field.
  • Proficient in Python or another programming language for data processing.
  • Skilled in designing, optimizing, and managing databases like Postgres or DynamoDB.
  • Experienced in building scalable ETL pipelines for data ingestion, transformation, and storage.
  • Knowledgeable about cloud platforms (AWS preferred) and tools like Lambda, S3, and Redshift.
  • Comfortable with data modeling and understanding the needs of analytics platforms.
  • Passionate about ensuring data accuracy, security, and reliability in fast-paced environments.
  • Familiar with SOC 2 compliance or other data security frameworks.

Exceptional Candidates Will Bring

  • Experience integrating data from financial systems (e.g., QuickBooks, Excel).
  • A strong understanding of data visualization requirements, including performance optimization for dashboards.
  • Familiarity with real-time or near-real-time data processing pipelines.
  • A history of contributing to high-growth startups or scaling data-intensive platforms.

What Good Looks Like

Q1 Targets:

  • Build ETL pipelines for QuickBooks and Google Sheets with robust data validation.
  • Deliver backend support for 5 pre-designed dashboard templates.
  • Ensure dashboards generate actionable insights with no data errors.

Q2 Targets:

  • Support incremental API integrations for Xero and HubSpot.
  • Optimize ETL performance to reduce load times by 30%.
  • Develop automated data cleaning and deduplication tools to reduce user intervention.

Q3 Targets:

  • Scale ETL pipelines to support 3,000 concurrent users.
  • Add support for multi-dataset analytics with "Ask Jules".
  • Deliver predictive analytics capabilities for proprietary metrics like Variance Analysis.

Q4 Targets:

  • Support enterprise-level integrations (e.g., SAP and Salesforce) with scalable ETL processes.
  • Build new ETL templates for Predictive Churn and Competitor Comparison metrics.
  • Ensure data pipelines can handle unstructured data sources with minimal user effort.

Why You’ll Love Working With Us

  • Be part of a company that’s saving businesses hours every week and driving smarter growth.
  • Work with a team of passionate innovators who see this as a once-in-a-lifetime opportunity to change the game.
  • Develop alongside some of the best minds in tech, with connections to MIT, Harvard, and Techstars.
  • We believe in working hard, celebrating wins, having plenty of ‘Fika’, and building a strong team dynamic that fosters creativity and collaboration.

Ready to Join Us?

If you’re excited about building for impact and love the idea of simplifying the complex, we’d love to hear from you.

Apply here