Cloud Integration Technical Lead
Location: Chicago
Posted On: 12/13/2022
Requirement Code: 62025
Requirement Detail
Responsibilities:
Perform technology or tools assessments to determine market trends and 'fit' within the KPMG landscape to support evolving business needs.
Evaluate technologies or design patterns and oversee their implementation from concept to working product.
Perform technical design and integration activities to enable Informatica EDC, Dataiku, Protegrity, Denodo, and Snowflake.
Design and develop technology and data architectures to support solutions, leveraging best practices across data management and data governance.
Exhibit strong knowledge of the Informatica Enterprise Data Catalog (EDC) and clearly articulate the value proposition of data management products (Collibra, EDC, Databricks Unity Catalog).
Act as a technical liaison between business and developer teams, helping translate user and business needs into an understandable architecture for developers.
Drive the processes and tools necessary to ensure efficient and effective data governance, data management, business/data catalog development, data lineage, data stewardship, data quality, analytics, machine learning, and business intelligence.
Lead and support technical teams in developing and deploying data products and/or production-level AI and machine learning solutions by leveraging modern open-source or enterprise-ready tools and technologies.
Identify opportunities and efficiencies in data engineering and data integration with best-in-class tools and technologies, and support the implementation of technical patterns.
Work with product leadership, providing product insight and guiding product direction and features.
Deliver architectural blueprints to support enablement of new integration patterns.
Collaborate with cross-functional teams to understand their business, align IT and data capabilities to areas of the business that can provide value, and develop roadmaps to help them achieve their missions.
Understand and communicate how best to move work and workloads to more productive solutions and environments through digital transformation.
Requirements/Qualifications:
Minimum of 10 years of experience working in a data-centric environment, with a strong understanding of enabling data capabilities and the end-to-end data product lifecycle.
5-7 years of demonstrated track record and expertise in architecting technical solutions spanning microservices, AI/ML platforms (Dataiku, DataRobot, RStudio, Databricks Lakehouse Platform), data management and pipelines (Spark, Airflow, Synapse, ELK), continuous integration/continuous deployment (Jenkins, Ansible, Terraform), and API management (Apigee, Kong).
Strong understanding of Azure, Databricks, and data warehouse concepts.
Experience with ServiceNow.
Experience with Agile methodology.