
Cloud Software Engineer

Jacobs

Hanover, MD

Job Description

Job Overview:

The candidate will implement the inference infrastructure for a pilot testing the feasibility and scalability of a new machine learning algorithm that identifies the author of small to large portions of text. The training infrastructure has already been implemented. The candidate will build the infrastructure to coalesce the appropriate data, apply the model to the collected data, and store the results in an Accumulo table that will be queried by the end-user system. The candidate will work closely with the researchers who developed the algorithm and with the developers of the end-user system. If time permits, the candidate may also be asked to accelerate training and inference of the machine learning algorithms using GPU-accelerated machine learning APIs.

This position requires front-end development with JavaScript, leveraging commercial libraries. Additionally, the position involves developing REST web services in Scala that interface with Accumulo and MongoDB databases.

  • A bachelor's degree in computer science or a related discipline; an associate's degree in computer science or a related discipline plus two (2) years of programming experience; or five (5) years of programming experience may be substituted for a bachelor's degree
  • Within the last ten (10) years, a minimum of seven (7) years of combined programming experience with all of the following languages: Java, C, and C++
  • Within the last seven (7) years, a minimum of five (5) years of experience designing and developing applications in a Windows or UNIX/Linux operating environment
  • Within the last five (5) years, a minimum of three (3) years of experience with open-source NoSQL products that support highly distributed, massively parallel computation needs, such as HBase, Apache Accumulo, and/or Bigtable
  • Within the last three (3) years, a minimum of one (1) year of experience with MapReduce
  • Within the last three (3) years, a minimum of one (1) year of experience with the Hadoop Distributed File System (HDFS)
  • Within the last three (3) years, a minimum of one (1) year of experience with requirements analysis and design for one (1) or more object-oriented systems
  • Within the last three (3) years, a minimum of one (1) year of experience developing RESTful services
  • Demonstrated experience developing analytics within the GHOSTMACHINE framework.
  • Demonstrated experience with Java.
  • Demonstrated experience with Git repositories.