This posting is managed by: Fidel Consulting KK
Company Name: Company is not publicly visible
IT (Other) - Data Analyst/Data Scientist
IT (PC, Web, Unix) - Web Application SE
IT (PC, Web, Unix) - Database SE
Industry: Internet Services/ISP (Internet Service Provider)
• Consult with business stakeholders on reporting needs, collect requirements, and document report background and technical specifications. Work with the same stakeholders to collect feedback on report contents.
• Conduct ad-hoc analysis to confirm requirements and important report metrics definitions, often working together with analysts in credit risk and/or data science. Identify, investigate, and solve data discrepancies in reports.
• Design and document ETL pipelines and data structures upstream from the final generated reports. Identify ways to optimize ETL steps and adjust upstream data models/structures to improve flexibility, efficiency, or report accuracy.
• Monitor report generation and provide first-line support for issues with deployed reporting pipelines and generated reports.
• Work with data scientists to design and deploy BI dashboards (we use Looker) that add value to reporting pipelines for stakeholders.
• Contribute to data lake/warehouse development and other data integration projects
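For context, the pipeline work described above often reduces to an extract-transform-load flow with built-in discrepancy checks. The following is a minimal, stdlib-only Python sketch; all function names and data are hypothetical, and a real deployment would wrap these steps in an orchestration tool such as Prefect or Airflow rather than plain functions:

```python
# Hypothetical ETL sketch: extract rows, drop/flag discrepancies,
# and load an aggregate report metric. Not the Company's actual code.

def extract_orders():
    """Pretend extraction from an upstream source (normally a DB or API)."""
    return [
        {"order_id": 1, "amount": 1200},
        {"order_id": 2, "amount": None},  # a data discrepancy to investigate
        {"order_id": 3, "amount": 800},
    ]

def transform(rows):
    """Separate clean rows from rows with missing amounts."""
    clean = [r for r in rows if r["amount"] is not None]
    discrepancies = [r["order_id"] for r in rows if r["amount"] is None]
    return clean, discrepancies

def load(rows):
    """Pretend load step: aggregate a report metric from clean rows."""
    return sum(r["amount"] for r in rows)

def etl():
    """Run the full pipeline; return the metric and flagged order IDs."""
    clean, discrepancies = transform(extract_orders())
    return load(clean), discrepancies
```

A monitoring layer (the "first-line support" duty above) would typically alert on a non-empty discrepancy list rather than silently dropping rows.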
The Company offers instant, monthly-consolidated credit to consumers by removing hassles from payment and purchase experiences. It uses proprietary models and machine learning to underwrite transactions in seconds and guarantee payments to merchants. The Company increases revenue for merchants by reducing the number of incomplete transactions, increasing conversion rates, boosting average order values, and facilitating repeat purchases from consumers.
This organization continues to innovate to make shopping easier and more fun both online and offline.
• A Bachelor's degree in Computer Science, Information Technology, or a related subject
• 3+ years of experience in data engineering or in software development of data-intensive applications
• Strong skills in Python, Git, Docker, SQL, ETL pipelines, and job orchestration/ETL tools such as Airflow or Prefect (we use Prefect)
• Knowledge and experience with either Scala or Java are a huge plus
• Knowledge and experience with Apache Spark is a plus
• Experience with financial/accounting products or services, in particular their technical implementation impacts and considerations
• Creative problem-solving skills and a passion for programming and solving problems with code
English Level: Business Conversation Level (TOEIC 735-860)
Japanese Level: Minimum Communication Level
Salary: JPY 8,000K - JPY 10,000K (Japanese Yen)