🔧 Data Engineer / Data Scientist (m/w/d)

Location: Berlin, Vienna, Cologne, Aachen, Frankfurt, Nuremberg (hybrid possible)
Employment: Full-time or part-time, starting immediately
Contact: anna@scavenger-ai.com
Scavenger AI makes data analysis as easy as chatting. We enable decision-makers in industrial companies to get data-driven recommendations in real time using natural language.
As a Data Engineer / Data Scientist, you’ll be responsible for ensuring that our AI understands business-related data just as well as a human would.
Your responsibilities:
You build and maintain ETL workflows using tools like Airflow.
You develop and improve data-cleaning algorithms in Python.
You integrate structured data sources (e.g. ERP, CRM, product databases) into our platform.
You collaborate closely with AI and frontend teams to quickly implement new use cases.
What you bring:
At least 2 years of experience building ETL pipelines and strong Python skills
Solid understanding of Business Intelligence, gained through work experience or academic projects
Motivation to take ownership and help shape our structures and processes
Experience in data modeling and ideally with cloud or DevOps tools (e.g. AWS, Docker, CI/CD)
Excellent English skills (German is a plus)
Strong communication, openness in teamwork, and constructive feedback mindset
What to expect:
Work in an ambitious AI tech startup with real-world impact
A high degree of creative freedom and influence on our data architecture
Flexibility: part-time models (e.g. 60–80%) possible
Direct collaboration with an interdisciplinary team on equal footing
We look forward to your application!