Design, develop, test, and install high-performance, data-intensive
applications using Hadoop-based technologies including Apache Spark, HDFS,
Hive, Sqoop, Oozie, and Impala, with Java. Develop Spark applications using
Apache Spark, Hive, and Java to process structured and unstructured data from
sources such as Oracle and HDFS. Develop batch processing applications using
object-oriented languages such as Java, with Spring Batch, Spring Integration,
and Oracle. Build and deploy Java applications using the Maven build
tool. Use Sonar for continuous inspection of code quality, performing
automated reviews with static analysis to detect bugs, code smells, and security
vulnerabilities. Design and implement application security to allow secure access
to application resources. Develop
authorization models at the application and database levels to secure access to
database objects. Design and develop Data Warehouse applications using Oracle
and dimensional modeling techniques. Design Data Warehouse schemas using the
star schema design pattern. Design and develop ETL workflows to extract,
transform, and load data into the Data Warehouse using the data integration
tool Informatica PowerCenter. Develop complex SQL queries. Optimize and
performance-tune long-running SQL queries. Develop stored procedures,
functions, and triggers to process, reformat
and store data using Oracle PL/SQL. Develop automation scripts using the UNIX
shell scripting language and Oozie workflows to deploy batch jobs on
distributed platforms. Design and develop Java-based applications using the
MVC (Model-View-Controller) design pattern and the Struts framework. Develop
middle-tier business logic components using Java 2 Platform, Enterprise
Edition (J2EE), Java Servlets, and XML. Implement design patterns such as
Business Delegate, DTO (Data Transfer Object), and DAO (Data Access Object)
for front-end and back-end systems in Oracle SQL and Java. Will work in
Glastonbury, CT and/or
various client sites throughout the U.S. Must be willing to travel and/or
relocate.
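
By way of illustration, the DTO (Data Transfer Object) and DAO (Data Access Object) patterns named among the duties above can be sketched in plain Java. All class and member names below are hypothetical, and the in-memory store merely stands in for an Oracle-backed implementation; this is a minimal sketch of the pattern, not code from any particular application:

```java
import java.util.HashMap;
import java.util.Map;
import java.util.Optional;

// DTO: a simple carrier of data between application layers, with no business logic.
class CustomerDto {
    final int id;
    final String name;
    CustomerDto(int id, String name) { this.id = id; this.name = name; }
}

// DAO: hides the persistence mechanism behind an interface, so callers
// do not depend on whether data lives in Oracle, HDFS, or memory.
interface CustomerDao {
    void save(CustomerDto customer);
    Optional<CustomerDto> findById(int id);
}

// In-memory stand-in for a real (e.g. JDBC/Oracle) implementation.
class InMemoryCustomerDao implements CustomerDao {
    private final Map<Integer, CustomerDto> store = new HashMap<>();
    public void save(CustomerDto c) { store.put(c.id, c); }
    public Optional<CustomerDto> findById(int id) {
        return Optional.ofNullable(store.get(id));
    }
}
```

Because callers program against `CustomerDao`, the in-memory implementation can later be swapped for a database-backed one without changing front-end or business-logic code.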