Data Engineer

Viercode is a Polish software company that connects the best tech-driven engineers with leading companies around the world that need to build or scale their technology teams. We build high-performing teams of software engineers for the world’s leading brands.

If you join our team, Viercode guarantees professional development through work alongside professionals and experts in a creative and friendly atmosphere, with daily duties that are free of routine.

We value your time and your involvement. That’s why we offer flexible working hours and remuneration that reflects your engagement and performance.

We are currently looking for a Data Engineer to join our client’s project team and provide services across the full life cycle of software production, including: analysis and development of functional specifications; data analysis in source systems; definition of business requirements; design, development, and post-implementation maintenance of IT systems supporting modern business processes; and support for end users during UAT.

Remuneration depends on the level of experience in the position, as well as the outcome of the recruitment process. 

Mid: 80-130 PLN per hour + VAT (B2B)
Senior: 130-170 PLN per hour + VAT (B2B)

If you want to work in a powerful network surrounded by passionate experts, this is the job you have been looking for!

Don't wait - send us your application, become part of the Viercode team, a most energizing community of engineers, and take on long-term projects across a variety of industries!


Responsibilities

  • Data Acquisition and Integration: Design and develop data acquisition and integration processes to collect and ingest data from various sources, including structured and unstructured data, and integrate it into data storage and processing systems.

  • Data Modeling and Storage: Develop and maintain data models, schemas, and database designs for efficient storage and retrieval of data, ensuring scalability, availability, and security.

  • Data Processing and Transformation: Create data processing pipelines to transform and clean the data using various tools and technologies such as ETL, Spark, Hadoop, and SQL.

  • Data Quality and Governance: Establish and enforce data quality standards and governance policies to ensure that data is accurate, consistent, and secure across different systems and applications.

  • Performance Optimization: Optimize the performance of data storage and processing systems, including indexing, partitioning, and clustering, to improve query and data processing times.

  • Infrastructure Management: Configure and manage the infrastructure required to support data storage, processing, and analysis, including cloud platforms, databases, servers, and networks.

  • Collaboration and Communication: Collaborate with cross-functional teams, including data scientists, analysts, and business stakeholders, to understand their data requirements and provide the necessary data infrastructure and support.

  • Documentation and Reporting: Document data processes, infrastructure, and systems, and provide regular reports and updates to stakeholders on the status and performance of the data storage and processing systems.


Requirements

  • 2+ years of experience with, and a strong understanding of, AWS data infrastructure;

  • 2+ years of experience and strong knowledge of AWS Glue and Lambda;

  • 2+ years of experience using Python in a data environment;

  • Experience with data lake and lakehouse paradigms;

    Nice to have:

  • Experience with Qlik Replicate;

  • Strong knowledge and understanding of the AWS Glue Data Catalog;

  • Experience working with Spark;

  • Basic understanding of Delta Lake;

  • Other AWS services: Lake Formation, data encryption, event bus;

  • Experience working in the insurance or fintech industry.


We offer

  • Interesting and ambitious projects, with no routine or repetitive activities
  • Flexible working hours
  • The possibility of taking part in international projects
  • An excellent work atmosphere, including team events and hackathons
  • A startup-like work atmosphere within a large corporation

Terms of cooperation

  • Fully remote work
  • Full-time work
  • English at B2 level
  • Polish at minimum B2 level