Data Pipeline Engineer

C Innovation Studio
Remote

Requirements



  • 5+ years of experience building and maintaining ETL pipelines

  • You have used multiple languages such as Java, Python, C++, and JavaScript

  • Experience with big data platforms such as Hadoop, Spark, BigQuery, etc.

  • Experience creating data pipelines and backend aggregations

  • Experience building ETL workflows for data pipelines with tools such as Apache Spark, Apache Beam, Apache Airflow, Smartstreams, Fivetran, or AWS Glue

  • Experience with a cloud data warehouse such as Redshift, Snowflake, BigQuery, or Synapse

  • You are comfortable manipulating large data sets and writing raw SQL

  • Clear communicator

Keep in mind that you never have to pay to apply, and never pay for equipment or training. The basic rule is: NEVER pay for anything when applying. When talking to the job poster, make sure you are speaking with someone from the actual company. By clicking the apply button you will leave Remotebond for the external application website.


