#Apidel is hiring Java Cloud Developers for one of our clients.
*Need immediate joiners, or those who can join within a maximum of 20 days.
*Work from home opportunity.
CVs can be shared at dhrumil.bhatiya@apideltech.com
Position - Java Cloud Developers
Skills - Java, AWS or Cloud, SQL, Kubernetes, Docker, Kafka
Experience - 3 to 7 years
Location - Bangalore/Cochin (Work from home)
For any queries, you can reach me directly at +91-6359878524.
#Java #Developers #JavaDevelopers #AWS #Cloud #JavaCloudDevelopers #Kafka #Spark #Kubernetes
We are #hiring at D Cube Analytics
Open positions: Data Engineer & Associate Product Architect. Preference will be given to those who can join within 15 days.
Send your #resume to hr_india@d3analytics.com
If you are a #dataengineer who likes to take on challenges, then we are the right place for you. Come and join one of the fastest-growing organizations in the biopharma domain.
Job Location: Bangalore (India), USA
Qualification: Bachelor's / Master's in Computer Science
Experience: 3-8 years
SKILL SET
· #sqldatabase, #datamodelling techniques & Data Lake projects, #etltools processes, and performance optimization techniques are a must.
· Minimum of 2 years of experience working with batch / real-time systems using technologies such as #databricks, #redshift, #hadoop, #apache #spark, #hive / #impala and #hdfs.
· Minimum of 2 years of experience in #AWS, #azure or Google Cloud, and programming languages (preferably Python).
· Experience in the #biopharma domain will be a big plus.
Omnitracs is hiring Staff QA Test Engineers / Sr. SDETs
Skills: #selenium #corejava #API #RESTful #microservices #SQL #SPARK
Location: #Bangalore
Exp: 6 to 15 years, NP < 30 days
Send CV to ragini.gupta@omnitracs.com
Preferable to have automated testing and data validation experience for high-volume and large data sets.
· Exposure to a variety of data processing systems, ideally both batch and streaming.
Forwarded from Chennai Jobs & Careers (Siva Ganesan)
#spark #hadoop #hive #pyspark
#spruceinfotech #spruceindia
Spruce InfoTech, Inc. is looking for a Hadoop developer
Job role: Contract to hire
Exp: 4-6 years
Location: Chennai, Hyd, Pune, Bangalore
NP: 30 days
JD:
4+ years of development experience in Hadoop/Spark/PySpark
Development experience in Hadoop (YARN, HDFS), Hive, Spark, Kafka, PySpark
Experience in at least one programming language, preferably Python
Interested candidates, please share your CV at Radhika@spruceinfotech.com