◆THE ROLE & THE TEAM
As a Data Engineer at JKOPay, you will process JKOPay's online transaction data and perform offline data analysis, drawing on your experience in data engineering. JKOPay serves over 5 million users in Taiwan, as well as partners and stakeholders that integrate with our payment service, which handles 3 billion in transaction volume every month.
With the rapid growth of JKOPay's business, the Data Engineer is key to designing systems for collecting, storing, and analyzing data at scale, as well as converting raw data into usable information for business analysts to interpret.
This is an opportunity to help shape and expand JKOPay's ETL data pipeline. If you are interested in handling multi-terabyte datasets and improving engineering efficiency on a great team that keeps growing, this role is for you.
◆WHY YOU SHOULD BE INTERESTED
-Translate business requirements and end-to-end designs into technical implementations, and build the batch data warehouse.
-Design data models, and write and optimize ETL jobs.
-Collaborate with the business team to build data metrics and data visualizations on top of the data warehouse.
-Build and maintain data products.
-Participate in rollouts, upgrades, implementations, and releases of data system changes as required to streamline internal practices.
◆WE'D LOVE TO MEET YOU IF
-At least 2 years of relevant experience in data engineering.
-Proficient in creating and maintaining complex end-to-end ETL pipelines with high reliability and security.
-Familiar with backend engineering, with practical knowledge of shell, SQL, and at least one other programming language (e.g. Python, Scala).
-Familiar with common distributed computing engines or platforms (e.g. ClickHouse, Spark, Kafka, BigQuery, Airflow).
-Familiar with at least one data exploration and visualization platform (e.g. Superset, Tableau).
-Familiar with data lake, data warehouse, and data mart concepts, with production experience in data modeling design.
-Familiarity with at least one cloud dataflow solution (e.g. AWS Data Pipeline, GCP Dataflow) is a plus.
-Experience with data catalog tools is a plus.
-Excellent interpersonal and communication skills, with the ability to engage and manage internal and external stakeholders across all levels of seniority.