Part-time | Fully remote
About the Role
We are seeking a motivated Data Engineer Intern to join our team. This internship provides hands-on experience working with data pipelines, database management, and integrations across key enterprise systems. The role is ideal for students or recent graduates who want to develop strong technical and problem-solving skills in data engineering and AI-driven solutions.
Pay: $20/hour
How You’ll Contribute to the NASPO Data Team
- Support procurement analytics by building pipelines that consolidate data from Salesforce (e.g., supplier, contract, and membership data) into PostgreSQL databases.
- Help improve data quality and consistency across systems to ensure accurate reporting for NASPO leadership and stakeholders.
- Assist in developing AI-enabled data preparation workflows that can identify trends, outliers, or predictive insights in procurement data.
- Work on automation scripts in Python to streamline repetitive reporting tasks and reduce manual effort for the data team.
- Provide documentation and process improvements to help the NASPO team make data-driven decisions faster.
Responsibilities
- Assist in designing, building, and maintaining ETL/ELT pipelines.
- Work with PostgreSQL databases for data modeling, transformation, and performance optimization.
- Develop and maintain Python scripts for data processing and automation.
- Support integration between Salesforce and internal data systems.
- Collaborate with data engineers, analysts, and business stakeholders to understand requirements.
- Ensure data quality, accuracy, and consistency across systems.
- Document technical processes, workflows, and project progress.
- Support preparation of datasets for AI/ML use cases, including feature engineering and preprocessing.
- Assist in evaluating AI/ML model outputs and integrating them into data pipelines.
Qualifications
- Currently pursuing or recently completed a degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Familiarity with Python programming (pandas, SQLAlchemy, or similar libraries).
- Basic understanding of relational databases, particularly PostgreSQL.
- Exposure to Salesforce data structures or APIs is a plus.
- Familiarity with Power BI or a similar reporting tool.
- Interest or coursework in AI/ML concepts (e.g., scikit-learn, TensorFlow, PyTorch) is a bonus.
- Strong analytical and problem-solving skills.
- Good communication skills and the ability to work in a team environment.
What You’ll Gain
- Real-world experience in data engineering and AI-driven integration projects.
- Hands-on practice with Python, PostgreSQL, and Salesforce technologies.
- Exposure to preparing and working with datasets for machine learning workflows.
- Mentorship from experienced engineers and exposure to enterprise data and AI solutions.
- Opportunity to contribute to meaningful projects that impact business operations.