Data Services Engineer at HSBC Technology Canada (Toronto, ON, Canada)
Location: Toronto, ON, Canada
Type: Full Time
Created: 2021-04-22 05:00:49
Key Objectives for Data Services Engineer:
The Senior Data Mesh Engineer is responsible for contributing throughout the Technology delivery lifecycle, across products and services, within a Data Mesh platform pod in HSS Digital and Data, working as a Senior Engineer across multiple deliveries on the data mesh.
This role will carry out some or all of the following activities:
Provide technical expertise within the Data Mesh cross-sector delivery and run lifecycle, utilising their skills and expertise to carry out software development, testing and operational support activities, typically being deployed to the most challenging and/or impactful projects or deliveries.
Build and support the HSS Data Mesh services, identifying and developing the most appropriate Technology solutions to meet customer needs as part of the Customer Journey(s).
Contribute to building out key data frameworks and utilities that Producer and/or Consumer teams across HSS can leverage: data lineage, data ingestion, API, data streaming, GDPR, etc.
Must possess in-depth experience in GCP data technologies (Dataflow, Google Cloud Storage, BigQuery, Airflow). Desirable: experience with Hadoop technologies (Hive, Spark, HDFS, HBase, Oozie, Ranger, KMS, Zeppelin).
Pair with other pod engineers to understand and drive the product or service's direction.
Work with Ops, Dev and Test Engineers to ensure operational issues (performance, operator intervention, alerting, design-defect-related issues, etc.) are identified and addressed at all stages of a product or service release/change, with a specific focus on API and Data platforms within a Data Mesh environment.
Contribute to the Engineering community within the pod, product area, HSS, GBM and the wider HSBC Group, helping to grow its skills and capabilities.
Have an “Automate first” mind-set, showcasing best-in-class DevOps tooling and practices to minimise variation and ensure predictable, high-quality code and data. Responsible for automating the continuous integration / continuous delivery pipeline within a DevOps Product/Service team, driving a culture of continuous improvement and extensively leveraging tools like Ansible, Jenkins and Terraform to provide E2E automation.
Keep up to date and maintain expertise on current tools, technologies and areas such as cyber security, as well as applicable regulations pertaining to data privacy, consent, data residency, etc., with a specific focus on data technologies on internal/external cloud.
Serve as a strong technical resource within the team, providing assistance to programmes or projects that require it.
- A Bachelor’s degree or equivalent experience with a major or minor in computer science or a related field, and a minimum of seven to ten years of proven and progressive experience, including a minimum of three years’ experience in the development of retail brokerage applications.
- Minimum of seven years of C# and .NET experience. Expertise in Visual Studio, GitHub, JIRA, Confluence, and Microsoft SQL Server with SSIS packages. Experience with Oracle DB and Linux.
- Experience in Change Management processes.
- Experience in providing on-call production support, especially for retail brokerage applications.
- Specialized experience in programming or database languages and techniques for application systems design; recognition as a technical resource within the work group; strong communication, analytical and leadership skills; the ability to take appropriate risk; knowledge of the businesses supported; and a demonstrated ability to apply multiple technologies to business situations, identifying and applying productivity improvements that blend the technical environment with strategic direction.