Country: United States of America
Location: MD233: 420 National Business Parkway, Suite 400, Annapolis Junction, MD, 20701, United States of America
The Cloud Software Engineer develops, maintains, and enhances complex and diverse Big-Data Cloud systems based upon documented requirements. Directly contributes to all stages of back-end processing, analyzing, and indexing. Provides expertise in Cloud Computing, the Hadoop Eco-System (including implementing Java applications), Distributed Computing, Information Retrieval (IR), and Object-Oriented Design. Works individually or as part of a team. Reviews and tests software components for adherence to the design requirements and documents test results. Resolves software problem reports. Utilizes software development and software design methodologies appropriate to the development environment. Provides specific input to the software components of system design, including hardware/software trade-offs, software reuse, use of Commercial Off-the-Shelf (COTS)/Government Off-the-Shelf (GOTS) products in place of new development, and requirements analysis and synthesis from the system level down to individual software components.
The Level 3 Cloud Software Engineer shall possess the following capabilities:
•Provide in-depth knowledge of Information Retrieval, assisting the software development team in designing, developing, and testing Cloud Information Retrieval
•Implement complex workflows that manage Cloud MapReduce analytics
•Implement code that interacts with Cloud Distributed Coordination Frameworks
•Oversee one or more software development tasks and ensure the work is completed in accordance with the constraints of the software development process being used on any particular project
•Make recommendations for improving documentation and software development process standards
•Serve as a subject matter expert for Cloud Computing and corresponding technologies, including Hadoop, assisting the software development team in designing, developing, and testing Cloud Computing systems
•Debug problems with Cloud-based Distributed Computing Frameworks
•Manage multi-node Cloud-based installations
•Delegate program.
Current TS/SCI with polygraph;
Twelve (12) years of software engineering experience on programs and contracts of similar scope, type, and complexity. A Bachelor’s degree in Computer Science or a related discipline from an accredited college or university is required. Four (4) years of the experience must be on programs utilizing Big-Data cloud technologies and/or Distributed Computing.
•(2) years of experience with Cloud and/or Distributed Computing Information Retrieval (IR).
•(1) year of experience implementing code that interacts with a Cloud BigTable implementation.
•(1) year of experience implementing code that interacts with a Cloud Distributed File System implementation.
•(1) year of experience implementing complex MapReduce analytics.
•(1) year of experience implementing code that interacts with Cloud Distributed Coordination Frameworks.
•(1) year of experience architecting Cloud Computing solutions.
•(1) year of experience debugging problems with Cloud-based Distributed Computing frameworks.
•(1) year of experience managing multi-node Cloud-based installations.
•(5) years’ experience with Utility Computing, Network Management, Virtualization (VMware or VirtualBox), and Cloud Computing.
•(5) years’ experience with Multi-Node Management and Installation: management and installation of Cloud and Distributed Computing on multiple nodes using Python, CFEngine, Bash, Ruby, or related technologies.
•(5) years’ experience securing Cloud-based and Distributed applications through industry-standard techniques such as firewalls, PKI certificates, and server authentication, with experience in corporate authentication service(s).
•(5) years’ experience with Object-Oriented Design and Programming, Java, Eclipse or a similar development environment, Maven, and RESTful web services.
•(5) years’ experience with Cloud and Distributed Computing Technologies: at least one or a combination of several of the following areas - YARN, J2EE, MapReduce, ZooKeeper, HDFS, HBase, JMS, Concurrent Programming, Multi-Node implementation/installation, and other applicable technologies.
•(5) years’ experience with Cloud and Distributed Computing Information Retrieval: at least one or a combination of several of the following areas - HDFS, HBase, Apache Lucene, Apache Solr, MongoDB.
•(5) years’ experience ingesting, parsing, and analyzing disparate data sources and formats: XML, JSON, CSV, binary formats, Sequence or Map Files, Avro, and related technologies.
•(5) years’ experience with Aspect-Oriented Design and Development.
•(5) years’ experience debugging and profiling Cloud and Distributed installations: Java Virtual Machine (JVM) memory management and profiling.
Experience with at least one SIGINT collection discipline (FORNSAT, CABLE, Terrestrial/Microwave, Overhead, or ELINT).
•Geolocation, emitter identification, and signal applications.
•Joint program collection platforms and dataflow architectures; signals characterization analysis;
•Configuration management tools such as Subversion, ClearQuest, or Razor.
Four (4) years of cloud software engineering experience on projects with similar Big-Data systems may be substituted for a bachelor’s degree. Master's degree in Computer Science or related discipline from an accredited college or university may be substituted for (2) years of experience. Cloudera Certified Hadoop Developer certification may be substituted for (1) year of Cloud experience.
This position is not relocation eligible.
Employee Referral Award Eligibility: Only employees currently within RMD and RI&S have the potential to receive a Referral Award for submitting a referral to RMD and RI&S roles. ALL eligibility requirements must be met to receive the Referral Award.
Raytheon Technologies is An Equal Opportunity/Affirmative Action Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or veteran status, age or any other federally protected class.