Sr. Big Data Engineer

Ktek Resourcing
Full-time
On-site
Phoenix, Arizona, United States

Company Description

K-Tek’s core business is temporary staffing, permanent placement, and volume hiring. Since inception, our staffing solutions business has grown multi-fold, with global offices. We know what works best for our clients and what doesn’t. This is our key differentiator and how we edge out the competition.

Job Description

Hi,   

 

Please review and let me know if you are interested.

Good salary + benefits + relocation + travel expenses – all provided.

 

FULL-TIME ONLY

 

Locations: Phoenix, AZ; Washington; Atlanta

 

Must-haves: Java (expert developer/engineer), Hadoop (expert level), and strong communication skills (the role involves interacting with clients).

 

Mode of interview: online test, phone rounds 1 and 2, and a final interview.

 

Position 1:

Big Data Sr. Software Engineer (10+ years)

Project: Development/Coding/Technical (multiple roles)

Overall 8-10 years of experience

At least 6 years of Java expertise/development and coding

Must have a minimum of 3 years of Hadoop, Hive, and Big Data experience

 

Description:

Very strong server-side Java experience, especially in open-source, data-intensive, distributed environments

Experience in an implementation role for high-end software products in the telecom/financial/healthcare/hospitality domains

Should have worked on open-source products; contributions to them would be an added advantage

Implementation experience with and in-depth knowledge of various Java, J2EE, and EAI patterns

Implemented complex projects dealing with considerable data sizes (GB/PB) and high complexity

Well versed in various architectural concepts (multi-tenancy, SOA, SCA, etc.) and NFRs (performance, scalability, monitoring, etc.)

Good understanding of algorithms, data structures, and performance optimization techniques

Knowledge of database principles and SQL, and experience working with large databases beyond just data access

Exposure to complete SDLC and PDLC

Capable of working both as an individual contributor and within a team

Self-starter with a resourceful personality and the ability to manage high-pressure situations

Should have experience with/knowledge of batch-processing and real-time systems using various open-source technologies such as Solr, Hadoop, NoSQL databases, Storm, Kafka, etc.

 

ROLE & RESPONSIBILITIES:

Implementation of various solutions arising out of large-scale data processing (GBs/PBs) over various NoSQL, Hadoop, and MPP-based products

Active participation in various architecture and design calls with Big Data customers

Working with senior architects and providing implementation details to offshore teams

Conducting sessions, writing whitepapers, and producing case studies pertaining to Big Data

Responsible for timely and quality deliveries

Fulfill organizational responsibilities: sharing knowledge and experience with other groups in the organization and conducting various technical sessions and trainings

 

 

Additional Information

All your information will be kept confidential according to EEO guidelines.