🚨 HIRING ALERT 🚨
🤵 Job Title: Data Platform Architect / SME
👨💻 Experience: 6+ Years
📍 Location: Remote
⏳ Duration: Long-term
🛫 Visa Type: USC or GC

We are seeking a skilled Data Platform Architect / SME with expertise in Airflow, AWS, and Snowflake to design, implement, and maintain data pipelines. The ideal candidate will have experience developing reusable scripts and workflows, optimizing data pipelines, and ensuring high availability and reliability. You will work with cross-functional teams to create scalable data models and architectures and stay current with emerging technologies. 🚀

💎 Key Tech Stack: Airflow, Snowflake, dbt, Python, AWS, SQL, Jenkins, Terraform

🌟 Responsibilities:
- Develop and maintain data pipelines using AWS, Airflow, and Snowflake.
- Design scalable data models and architectures.
- Create reusable scripts and workflows for data processing.
- Optimize data pipelines for performance and cost-effectiveness.
- Implement monitoring solutions and document best practices.
- Design DAGs in Airflow and support dbt for data transformation.
- Manage infrastructure on AWS and automate CI/CD deployments.

🌟 Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
- 6+ years of experience in data engineering with Airflow and Snowflake.
- Proficiency in SQL, Python, and DevOps tools such as Terraform and Jenkins.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.

Apply now on LinkedIn, or email toufiq@pplability.com to join us in building scalable and efficient data platforms. 📤

#Pplability #DataEngineering #DataPlatform #Airflow #AWS #Snowflake #RemoteWork #DataPipelines #TechJobs #Hiring #SQL #Python #DevOps #Career #Growth #Jobopening #Applynow
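The "reusable scripts" and "optimized, reliable pipelines" responsibilities above can be illustrated with a minimal, dependency-free sketch of a watermark-based incremental load — the pattern that makes an Airflow task safe to retry. All names here (`incremental_load`, the sample rows) are hypothetical stand-ins for a real Snowflake source and loader, not part of the posting.

```python
def incremental_load(source_rows, watermark, load_fn):
    """Load only rows newer than the last high-water mark.

    Re-running with the returned watermark loads nothing new,
    which is what makes this step safe to retry in an Airflow DAG.
    """
    new_rows = [r for r in source_rows if r["updated_at"] > watermark]
    if new_rows:
        load_fn(new_rows)
        # Advance the watermark to the newest row we just loaded.
        watermark = max(r["updated_at"] for r in new_rows)
    return watermark

# Hypothetical sample data; ISO date strings compare correctly as text.
rows = [
    {"id": 1, "updated_at": "2024-01-01"},
    {"id": 2, "updated_at": "2024-01-03"},
]
loaded = []
wm = incremental_load(rows, "2024-01-02", loaded.extend)
```

Only the row newer than the previous watermark is loaded, and the returned watermark advances so the next run starts where this one left off.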
Pplability INC’s Post
-
#hiring #w2only
Job Title: Sr. AWS Data Engineer - Fully Remote - PST Zone
Email: rithik@foxprotech.com (or) DM Ritik Arora

Required:
- SQL, PySpark, and Databricks
- Solid AWS experience, including management of resources such as S3, EventBridge, Glue, EMR, Redshift, and Lambda
- Scripting in Python and PySpark; at least 3 years' experience
- Knowledge of Airflow and Redshift connections
- Apache Airflow on Astronomer (good to have)
- Knowledge of Terraform and GitHub management

The majority of the work involves moving data between Salesforce and Redshift, using AWS resources for the transformations and storage in between.

#fullyremote #remote #remotejobs #remotehiring #hiringnow #dataengineer #dataengineerjobs #awsdataengineer #airflow #python #pyspark #terraform #apache #w2jobs #aws #s3 #glue #redshift #lambda
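The Salesforce-to-Redshift movement this role describes typically stages batches of records in S3 and then issues a Redshift COPY. Here is a minimal, library-free sketch of just the batching and key-naming step; the prefix, partition layout, and sample records are invented for illustration.

```python
import json

def stage_batches(records, batch_size, run_date, prefix="salesforce/accounts"):
    """Split records into COPY-sized batches and assign each a
    date-partitioned S3-style key, as a Redshift COPY from S3 expects."""
    batches = []
    for i in range(0, len(records), batch_size):
        chunk = records[i:i + batch_size]
        key = f"{prefix}/dt={run_date}/part-{i // batch_size:04d}.json"
        # JSON-lines body: one record per line, the format COPY can ingest.
        body = "\n".join(json.dumps(r) for r in chunk)
        batches.append((key, body))
    return batches

batches = stage_batches([{"Id": n} for n in range(5)], batch_size=2,
                        run_date="2024-05-01")
```

Five records with a batch size of two yield three staged objects; in a real pipeline each `(key, body)` pair would be written to S3 (e.g. via boto3) before running COPY.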
-
#hiring #w2only
Job Title: Sr. AWS Data Engineer - Fully Remote - PST Zone
Email: ajay@foxprotech.com (or) DM Ajay

Required:
- SQL, PySpark, and Databricks
- Solid AWS experience, including management of resources such as S3, EventBridge, Glue, EMR, Redshift, and Lambda
- Scripting in Python and PySpark; at least 3 years' experience
- Knowledge of Airflow and Redshift connections
- Apache Airflow on Astronomer (good to have)
- Knowledge of Terraform and GitHub management

The majority of the work involves moving data between Salesforce and Redshift, using AWS resources for the transformations and storage in between.

#fullyremote #remote #remotejobs #remotehiring #hiringnow #dataengineer #dataengineerjobs #awsdataengineer #airflow #python #pyspark #terraform #apache #w2jobs #aws #s3 #glue #redshift #lambda
-
Data Engineer as a Service: offered for a limited period only!*

Due to an upcoming parental leave, I am looking for a short and/or part-time assignment. If you want to:
- do a prestudy
- review an existing prestudy
- fill the gap during an FTE recruitment
- increase the pace of an existing project
- meet an end-of-year deadline, or
- just fix a report that has been wrong forever

perhaps a short assignment with a senior interim data engineer or data platform architect would be of interest? I have spent the last 4-5 years building cloud or hybrid data platforms, often rebuilding from on-prem data warehouse solutions, using tools like Airflow, dbt, and Snowflake on infrastructure like Kubernetes and Docker in AWS and Azure. Of course, if you want something built in Python and SQL, or have no idea what to build but know it is about data and want it delivered quickly, I can also help out.

I am based in Stockholm. If you are not currently looking yourself but know someone who might, feel free to repost or comment.

*But at least until the end of 2023, and probably limited part-time during Q1-Q2 2024.

#openforhire #freelance #dataengineer #datapipelines #python #dbt #snowflake #airflow #elt #k8s #azure #aws
-
🌟 Exploring New Data Engineering Opportunities 🌟

Greetings, LinkedIn community! I hope this message finds you thriving. I'm Akhil P, a seasoned Data Engineer with over a decade of experience in the field. I've architected robust data pipelines and orchestrated data migrations to cloud environments across AWS, Azure, and GCP, and I specialize in AWS, backed by the Solutions Architect Associate certification. Across industries spanning entertainment, telecom, healthcare, and banking, I've had the privilege of making significant contributions.

Currently, I'm looking for new opportunities in Data Engineering on a C2C basis. If your organization is looking for a dynamic professional to drive data initiatives forward, or if you simply have valuable insights to share, I'd love to connect and explore synergies. Feel free to reach out to me at avpenmatcha@gmail.com. Your time and consideration are greatly appreciated, and I look forward to engaging with this vibrant professional community.
Warm regards, Akhil P

#opentowork #opportunities #career #hiring #recruitment #AWS #azurecloud #dataengineer #dataanalyst #bigdataengineer #cloudengineer #python #sqlserver #jobsusa #c2crequirement #jobalert #usajobs #contractjobs #itrecruiters #recruiters #technicalrecruiter #c2c #c2cOpportunities #c2cjobs #c2crequirements #corptocorp #corp2corp #c2cvendor #c2cusajobs #c2cconsultant #c2croles #ITRecruiter #TechRecruiter #Hiring #TechJobs #JobOpportunities #ITJobs #Recruitment #TechTalent #ITIndustry #JobSeekers #TechCareer #ITNetworking #JobSearch #TechHiring #Recruiting #DataEngineer #BigData #ETL #DataProcessing #DataWarehousing #DataPipeline #ApacheSpark #Hadoop #Python #Scala #SQL #DataIntegration #DataModeling #DataArchitecture #StreamingAnalytics #BatchProcessing #DataWarehouse #ETLJobs #DataEngineeringJobs
-
We are hiring a Data Engineer!

Key Skills:
- 8+ years of experience as a Data Engineer in a cloud environment with Azure
- 5+ years of experience with Azure services such as Data Factory, Databricks, Data Flows, Key Vault, etc.

Skills needed:
1. ADF: pipelines, data flows, datasets, activities, triggers
2. SQL Server & Cosmos DB
3. Azure Integration Runtime
4. Azure Blob / Data Lake Storage

Nice to have:
1. Azure Key Vault
2. Private networking
3. Data warehouse design
4. Azure Synapse
5. R/Python scripts
6. Azure DevOps
7. Bicep

Key Responsibilities:
- Design, build, and optimize systems for data collection, storage, and access at scale.
- Analyze and organize raw data; combine raw information from different sources.
- Build data systems and pipelines.
- Perform data modeling and analysis.
- Evaluate business needs and objectives.
- Interpret trends and patterns; conduct complex data analysis and report on results.
- Prepare data for prescriptive and predictive modeling.
- Explore ways to enhance data quality and reliability.
- Implement authentication and authorization in pipelines, including Key Vault and secrets consumption.
- Nice to have: C#, Web API, hybrid source connections, DevOps.

Work Experience: 8+ years

#consign #consignspacesolutions #consignjob #consignjobs #wearehiring #hiringnow #dataengineer #azurecloud #datafactory #databricks #dataflows #keyvault #sqlserver #cosmosdb #azureintegration #blobstorage #datalake #datawarehousing #azuresynapse #rscripts #pythonscripts #azuredevops #bicep #datamodeling #dataanalysis #datapipelines #prescriptivemodeling #predictivemodeling #dataquality #datareliability #authentication #authorization #csharp #webapi #hybridconnections #devops #techjobs #jobopening #workexperience
-
We are #hiring:
1. AWS Data SRE Engineer
2. AWS Data Engineer
3. Data Modeler

If you are interested, please send your profile to sathyamoorthy.duraisamy@ltimindtree.com. Also, please share with your network.

#LTIMindtree #DeliveryImpact #SolvingwithData #dataandanalytics #WeareLTIMindtree #FutureFasterTogether
Srikarthick Jayaraman, Vikram Jayaprakash, Rukshar Khatun, Sourakar Chaudhuri
-
#Hiring #AWS #DataEngineer #Remote
Please email resumes in Word format with current visa status to jobs@Salliance.us

AWS Data Engineer (8+ years exp)

Job Summary:
- 8+ years' experience in Python with any RDBMS and APIs
- Writing effective, scalable code and testing code in Python
- Experience automating code testing and pipelines
- Snowflake: Snowpark, procedures, tasks, views
- SQL: query optimization, window functions, CRUD operations
- Python: PySpark, DataFrames, ETL, unit testing, REST APIs
- AWS: RDS, DynamoDB, Kinesis, S3, CloudWatch
- DevOps: Git, pipeline deployment tools such as Azure DevOps Pipelines, Jenkins, GitLab, CircleCI, etc.
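As a concrete illustration of the window-function skill this posting lists, here is a runnable sketch using Python's standard-library sqlite3 (window functions require SQLite ≥ 3.25, which ships with current Python builds); the table and column names are invented for the example, but the same `OVER (PARTITION BY ...)` pattern applies in Snowflake or Redshift.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount INTEGER)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10), ("a", 20), ("b", 5)])

# Running total per customer: the window restarts at each PARTITION BY key.
rows = conn.execute("""
    SELECT customer, amount,
           SUM(amount) OVER (PARTITION BY customer ORDER BY amount) AS running
    FROM orders
    ORDER BY customer, amount
""").fetchall()
# rows -> [('a', 10, 10), ('a', 20, 30), ('b', 5, 5)]
```

Unlike `GROUP BY`, the window function keeps every input row while accumulating across the partition, which is what makes it useful for running totals and rankings.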
-
Hello #folks! We have an urgent requirement for an AWS Data Engineer with one of our prime clients.

Role: Sr. AWS Data Engineer
Location: Houston, TX (Onsite)
Contract

Job Description:
- 9+ years of strong hands-on experience in data warehousing, data engineering, and dimensional modeling.
- Should be able to work independently with minimal guidance, with excellent problem-solving and analytical skills.

Required Skills:
- Experience building and maintaining ETL pipelines over large data sets using services such as AWS Glue, EMR, Kinesis, or Kafka
- Strong Python development experience, with proficiency in Spark or PySpark and in using APIs
- Strong SQL query writing and performance tuning in AWS Redshift and other industry-leading RDBMS such as MS SQL Server and Postgres
- Proficient with AWS services such as Lambda, EventBridge, Step Functions, SNS, and SQS
- Familiar with how IAM roles and policies work

Preferred Skills:
- Experience with workflow management tools such as Airflow
- Familiarity with infrastructure as code such as CloudFormation
- Experience with CI/CD pipelines and agile methodologies

Do let me know if you are available and interested, or help this post reach someone who could be a great fit for this opportunity. I am available at 609-897-9670 Ext. 2216 or BishnuK@sysmind.com

#awsdataengineering #datascience #bigdata #dataengineer #dataanalytics #bigdataanalytics #data #python #pythonprogramming #dataanalysis #datavisualization #businessintelligence #datawarehouse #sql #datasciencetraining #bi #bigdataanalysis #technology #datascientist #datamanagement #programminglife #pythonlearning
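For the IAM roles-and-policies familiarity mentioned above, a minimal example policy granting the read side of an S3 staging flow — the bucket name is hypothetical, and note that `ListBucket` applies to the bucket ARN while `GetObject` applies to the objects under it, which is why both resources are listed:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket"],
      "Resource": [
        "arn:aws:s3:::example-staging-bucket",
        "arn:aws:s3:::example-staging-bucket/*"
      ]
    }
  ]
}
```

A policy like this would typically be attached to the role assumed by a Glue job or Lambda function rather than to an individual user.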
-
Sr Cloud Data Engineer | Python | SQL | AWS | PySpark | Kafka | Hadoop | Data Warehousing | Looking for contract roles
🔍 Pavithra Masilamani - Data Engineer Seeking New Opportunities 📢

Hello LinkedIn community! I hope this post finds you well. 👋 I'm Pavithra, an experienced Data Engineer actively seeking new challenges and opportunities in the data space.

🔧 **Skills:** ETL, SQL, Python, AWS, Big Data technologies, Data Warehousing

👨💻 **Experience:**
• Built a Snowflake stage over Azure Blob and mounted a Snowflake external table on that stage; proficient in table design using data clustering keys.
• Migrated an existing on-premises application to AWS, using services like EC2 and S3 for processing and storing small data sets; experienced in maintaining Hadoop clusters on AWS EMR.
• Strong experience and knowledge of real-time data analytics using Spark Streaming, Kafka, and Flume.

🚀 **Why Hire Me:** I am passionate about leveraging data to drive insights and improvements. My hands-on experience across these technologies has equipped me with the skills needed to tackle complex data challenges.

🌐 **Open to Collaborations:** I'm open to discussing opportunities in data engineering, data analytics, or any role where I can contribute my expertise. If you or someone you know is looking for a dedicated, results-driven Data Engineer, please don't hesitate to reach out or share this post. Let's connect and explore how my skills can bring value to your team!

#DataEngineer #DataEngineering #JobSeeker #DataAnalytics #TechJobs #c2cvendors #c2crequirements #contractjobs #Snowflake #platform #Hadoop #SAS #MySQL #MongoDB

Thank you for your time and support! 🙌