Data Engineer

Deadline: 15 March 2019 | Position Ref: 1092
Skills Required

XML, JavaScript, Git, MongoDB, XSLT, Node.js, Solr, SPARQL, ETL tools (Pentaho Data Integration), RDF, DCAT, DCAT-AP, DBMS


Additional Skills

Python


Expertise

Data Engineer


Language

English


Total Experience (months or years)

84 months (7 years)


Description

  • Create and maintain optimal data pipeline architecture.
  • Develop, maintain and review the data catalogue, which is composed of several components (CKAN, EntryScape, Node.js, ETL, Pentaho)
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
 
Qualifications
 
  • Master's degree (minimum 5 years of study after secondary school)
  • 7-13 years of professional experience (apart from studies)
  • Strong knowledge of Extract Transform Load (ETL) processes
  • Strong knowledge of semantic web technologies, especially Resource Description Framework (RDF), SPARQL and triplestores (e.g., Virtuoso)
  • Knowledge of the Pentaho data integration platform would be preferable
  • Strong knowledge of XML and XSLT
  • Prior experience with semantic web vocabularies, especially DCAT, and knowledge of the DCAT-AP standard
  • Experience with version control systems (Git)
  • Strong knowledge of JavaScript
  • Knowledge of Python would be an asset
  • Knowledge of the Solr indexing and search platform
  • Knowledge of database management systems (DBMS)
  • Knowledge of Node.js
  • Knowledge of MongoDB


Location

Ispra


ITALY


Duration

1 year


© 2019 Apogee Information Systems. All Rights Reserved.