
Snowflake Technical Specialist

Company Description

  • StarTekk’s adoption of digital transformation aims to accelerate organizational growth, increase efficiency, and help the Star workforce achieve focused business goals. The employee will help our organization identify and bridge gaps in the business by analyzing data and by scaling and supporting digital transformation initiatives.

Job Description

The contract manager will change the hybrid work status based on criticality, prioritization, and project deadlines.
The Technical Specialist will be responsible for migrating the current data, frameworks, and programs from the ODM EDW IOP Big Data environment to the ODM EDW Snowflake environment. The Technical Specialist will also be involved in Medicaid Enterprise Data Warehouse design, development, implementation, migration, maintenance, and operation activities, and will work closely with the Data Governance and Analytics team. This person will be one of the key technical resources for ingesting data into the ODM EDW Snowflake environment and for building new, or supporting existing, data warehouses and data marts for data analytics and exchange with State and Medicaid partners. This position is a member of Medicaid ITS and works closely with the Business Intelligence & Data Analytics team.

Responsibilities:
Participate in team activities, design discussions, stand-up meetings, and planning reviews with the team.
Provide Snowflake database technical support in developing reliable, efficient, and scalable solutions for various projects on Snowflake.
Ingest the existing data, frameworks, and programs from the ODM EDW IOP Big Data environment into the ODM EDW Snowflake environment using best practices.
Design and develop Snowpark features in Python; understand the requirements and iterate.
Interface with the open-source community and contribute to Snowflake's open-source libraries, including Snowpark Python and the Snowflake Python Connector.
Create, monitor, and maintain role-based access controls, virtual warehouses, tasks, Snowpipe, and streams on Snowflake databases to support different use cases.
Tune the performance of Snowflake queries and procedures; recommend and document Snowflake best practices.
Explore new Snowflake capabilities, perform POCs, and implement them based on business requirements.
Create and maintain Snowflake technical documentation, ensuring compliance with data governance and security policies.
Implement Snowflake user/query log analysis, history capture, and user email alert configuration.
Enable data governance in Snowflake, including row/column-level data security, using secure views and dynamic data masking features.
Perform data analysis, data profiling, data quality checks, and data ingestion in various layers using Big Data/Hadoop/Hive/Impala queries, PySpark programs, and UNIX shell scripts.
Follow the organization's coding standards document; create mappings, sessions, and workflows per the mapping specification document.
Perform gap and impact analysis of ETL and IOP jobs for new requirements and enhancements.
Create mock-up data, perform unit testing, and capture result sets for jobs developed in lower environments.
Update the production support runbook and Control-M schedule document per each production release.
Create and update design documents; provide detailed descriptions of workflows after every production release.
Continuously monitor production data loads, fix issues, update the tracker document, and identify performance issues.
Tune long-running ETL/ELT jobs by creating partitions, enabling full loads, and applying other standard approaches.
Perform quality assurance checks and reconciliation after data loads, and communicate with vendors to receive corrected data.
Participate in ETL/ELT code reviews and design reusable frameworks.
Create change requests, work plans, test results, and BCAB checklist documents for code deployment to the production environment, and perform code validation post-deployment.
Work with the Snowflake Admin, Hadoop Admin, ETL, and SAS Admin teams on code deployments and health checks.
Create a reusable Audit Balance Control framework to capture reconciliation, mapping parameters, and variables, serving as a single point of reference for workflows.
Create Snowpark and PySpark programs to ingest historical and incremental data.
Create SQOOP scripts to ingest historical data from the EDW Oracle database into Hadoop IOP; create Hive tables and Impala view creation scripts for dimension tables.
Participate in meetings to continuously upgrade functional and technical expertise.
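
The row/column-level security duty above can be sketched in Snowflake SQL using a dynamic data masking policy and a secure view. This is a minimal illustration, not part of the posting: all database, schema, table, column, and role names here are hypothetical.

```sql
-- Hypothetical example: column-level masking plus a row-filtered secure view.
-- Mask SSNs for every role except the data-governance role.
CREATE MASKING POLICY ssn_mask AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'DATA_GOVERNANCE' THEN val
    ELSE 'XXX-XX-' || RIGHT(val, 4)
  END;

ALTER TABLE edw.medicaid.members
  MODIFY COLUMN ssn SET MASKING POLICY ssn_mask;

-- Row-level restriction: a role only sees rows for its assigned regions,
-- looked up from a (hypothetical) role-to-region mapping table.
CREATE SECURE VIEW edw.medicaid.members_v AS
  SELECT *
  FROM edw.medicaid.members m
  WHERE m.region IN (
    SELECT region
    FROM edw.medicaid.role_regions
    WHERE role_name = CURRENT_ROLE()
  );
```

Consumers would be granted SELECT on the secure view rather than the base table, so both the masking policy and the row filter apply regardless of which tool issues the query.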

REQUIRED Skill Sets:
Proficiency in data warehousing, data migration, and Snowflake is essential for this role.
Strong experience in the implementation, execution, and maintenance of data integration technology solutions.
Minimum 4-6 years of hands-on experience with cloud databases.
Minimum 2-3 years of hands-on data migration experience from Big Data environments to Snowflake.

Key Skills:
Data Warehousing, Data Migration, Snowflake, Cloud Databases, ETL/ELT, Python, PySpark, SQL, Hadoop, Hive, Impala, Performance Tuning, Data Governance, Security Policies, Role-Based Access Control, Data Profiling, Data Quality, Unix Shell Scripting, Code Deployment, Documentation, Troubleshooting, Change Management, Audit, Compliance

Qualifications

These duties are too complex and specialized to be performed without a bachelor’s degree in computer science, computer information systems, or information technology.

Additional Information

All your information will be kept confidential according to EEO guidelines.

 

Average salary estimate

$100,000 / year (est.)
min: $80,000
max: $120,000


EMPLOYMENT TYPE
Full-time, hybrid
DATE POSTED
March 9, 2025
