
Apex Systems [149270]
Job Description:
- Experience with GCP services such as Compute Engine, Dataproc, Kubernetes Engine, Cloud Storage, BigQuery, Pub/Sub, Cloud Functions, Dataflow, and Cloud Composer.
- ETL experience working with large data sets: PySpark, Python, Spark SQL, DataFrames, PyTest.
- Develop and implement proactive monitoring and alerting mechanisms for data issues.
- Familiarity with CI/CD pipelines and automation tools such as Jenkins, GitHub, and GitHub Actions.
- Able to write complex SQL queries for business results computation.
- Develop architecture recommendations based on GCP best practices and industry standards.
- Work through all stages of a data solution life cycle: analyze/profile data; create conceptual, logical, and physical data model designs; architect and design ETL, reporting, and analytics solutions.
- Conduct technical reviews and ensure that GCP solutions meet functional and non-functional requirements.
- Strong knowledge of GCP architecture and design patterns.

Required Skills: GCP and PySpark
Basic Qualification:
Additional Skills:
Background Check: Yes
Drug Screen: Yes
Notes:
Selling points for candidate:
Project Verification Info:
Exclusive to Apex: No
Face to face interview required: No
Candidate must be local: No
Candidate must be authorized to work without sponsorship: Yes
Interview times set: Yes
Type of project:
Master Job Title:
Branch Code: