Are you ready to power the World’s connections?
We’re looking for a Data Platform Engineer to join our team. In this role, you’ll design, develop, and maintain scalable data pipelines and systems, leveraging modern data engineering tools and techniques. You’ll collaborate with cross-functional teams to ensure data is accessible, reliable, and optimized for analytics and decision-making processes.
This position requires deep expertise in handling large-scale data systems, including Snowflake, Kafka, dbt, Airflow, and other modern ELT/Reverse ETL technologies.
What you’ll be doing:
Design & Build Scalable Data Pipelines: Develop and maintain real-time and batch data pipelines using tools like Kafka, dbt, and Airflow/Snowpark.
Data Modeling: Implement and optimize data models in Snowflake to support analytics, reporting, and downstream applications.
Implement ELT Processes: Build efficient ELT pipelines that transform raw data into structured, queryable formats.
Reverse ETL Solutions: Enable operational analytics by implementing Reverse ETL workflows that sync processed data back into operational tools and platforms.
Data Integration: Work with APIs, third-party tools, and custom integrations to ingest, process, and manage data flows across multiple systems.
Automation: Leverage orchestration tools like Apache Airflow or Snowpark to automate workflows and improve operational efficiency (a brief illustrative sketch follows this list).
Collaboration: Partner with Data Scientists, Analysts, and Product teams to understand business requirements and deliver actionable data insights.
Governance & Security: Implement and maintain data governance policies and ensure compliance with data security best practices.
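To give a concrete flavor of the orchestration work described above, here is a minimal, hedged sketch of an Airflow DAG that loads a raw extract and then runs dbt models. The DAG name, script paths, and dbt project directory are illustrative assumptions, not details from this posting or Kong's actual stack.

```python
# Illustrative only: a daily ELT DAG that ingests raw data, then runs dbt.
# Assumes a recent Airflow 2.x install; all names and paths are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_elt",                  # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                   # run once per day
    catchup=False,
) as dag:
    # Step 1: land raw source data in the warehouse (placeholder script).
    ingest = BashOperator(
        task_id="ingest_raw_data",
        bash_command="python /opt/pipelines/ingest.py",
    )

    # Step 2: transform raw tables into analytics-ready models with dbt.
    transform = BashOperator(
        task_id="dbt_run",
        bash_command="dbt run --project-dir /opt/dbt/analytics",
    )

    # Ingestion must finish before transformations run.
    ingest >> transform
```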
What You’ll Bring:
Technical Expertise: Experience with Snowflake, including design, optimization, and query performance tuning; hands-on experience with Apache Kafka for streaming data; proficiency in dbt for transforming data and building reusable models; expertise in Apache Airflow or similar orchestration tools; and knowledge of ELT and Reverse ETL concepts.
Programming: Strong proficiency in Python, Java, and SQL.
Data Systems: Experience working with modern data ecosystems, including cloud-based architectures (AWS, Azure, GCP).
Data Modeling: Experience building and managing data warehouses, including dimensional modeling.
Problem-Solving: Strong analytical and debugging skills to tackle complex data engineering challenges.
Collaboration: Excellent communication skills for working with technical and non-technical stakeholders.
Kong has different base pay ranges for different work locations within the US, which allows us to pay employees competitively and consistently across geographic markets. Compensation varies depending on a wide range of factors, including but not limited to the specific candidate's location, role, skill set, and level of experience. Certain roles are eligible for additional rewards, including sales incentives, depending on the terms of the applicable plan and role. Benefits may differ depending on location. The typical base pay range for this role is CAD 123,025.00 – 147,677.50.
About Kong
Kong Inc., a leading developer of cloud API technologies, is on a mission to enable companies around the world to become “API-first” and securely accelerate AI adoption. Kong helps organizations globally, from startups to Fortune 500 enterprises, unleash developer productivity, build securely, and accelerate time to market. For more information about Kong, please visit www.konghq.com or follow us on X @thekonginc.