Applications for this position are currently paused

Job Description

Responsibilities Include:

  • Collect, define, and document engineering data requirements
  • Design and develop the data pipeline to integrate engineering data into the data lake
  • Design the analytics database schema
  • Automate and monitor ETL/ELT jobs for the analytics database
  • Design the data model to integrate with existing business data
  • Work with the existing team to integrate, adapt, or identify new tools to efficiently collect, clean, prepare, and store data for analysis
  • Design and implement data quality checks to ensure high data quality for dashboards and ML/AI models
  • Provide technical and backend configuration support for the engineering applications

Requirements

Basic Qualifications:

  • Bachelor’s degree in Computer Science, Mathematics, Engineering, or a related field
  • Experience writing shell scripts, scheduling cron jobs, and working in a Linux environment
  • Experience using Airflow or similar data pipeline tools
  • Experience using GitLab, GitHub, or similar version control tools
  • 4+ years’ experience with an object-oriented programming language such as Python or Java
  • 4+ years’ experience building processes supporting data pipelines, data cleansing/transformation, and data quality monitoring
  • 4+ years’ experience in database schema design, data pipeline design, and database management
  • 4+ years’ experience optimizing data pipelines, architectures, and data sets
  • Fluency in structured and unstructured data and their management through modern data transformation methodologies
  • Experience with engineering data management tools such as Cadence, Pulse, Windchill, ELOIS, or Creo
  • Strong analytical, problem-solving, and verbal and written communication skills
  • Not afraid of conflict, and able to build consensus through direct interaction and compromise
  • Ability to work effectively cross-culturally and across multiple time zones
  • Ability to work with cross-functional teams and stakeholders

Preferred Qualifications:

  • 4+ years’ experience designing and managing data in modern ETL architectures such as Airflow, Spark, Kafka, Hadoop, or Snowflake
  • Experience working with engineering data or similar data
  • Experience creating APIs for MS SQL databases
  • Experience with Microsoft Power Platform
  • Experience designing and developing dashboards
  • A successful history of manipulating, processing and extracting value from large datasets
  • Brings established relationships across Lenovo ISG to the role
  • English proficiency is preferred, and Mandarin capability is an advantage

Interview process

The interview will be conducted via Microsoft Teams.

A minimum of two and a maximum of three interview rounds will be conducted by the team, which is based in the US.

5 years of experience required
1,300,000+ TWD / year
Partial Remote Work

About us

Lenovo is a US$60 billion global technology company, ranked in the Fortune Global 500, with business across 180 markets. Focused on our vision of “Smarter Technology for All”, we provide smart devices and infrastructure to millions of customers every day and help them build smart solutions, services, and software, working together toward a more inclusive, trustworthy, and sustainable digital future.

Our values are the standard for how we act. Lenovo’s values include: serving customers, integrity and win-win, entrepreneurial spirit, and pioneering innovation.


Team

Talent Acquisition Team
Talent Acquisition Specialist
Talent Acquisition Specialist
Talent Acquisition Coordinator
Talent Acquisition Partner