Big Data Subject Matter Expert, Systems Engineer, Must Be A U.S. Citizen

Location: Dulles, VA
Date Posted: 09-06-2018
MUST BE A U.S. CITIZEN
W-2/1099 CONTRACT
ALL QUALIFIED RESUMES WILL RECEIVE FEEDBACK WITHIN 24 HOURS

Company is seeking a Big Data Subject Matter Expert (SME) to work with multiple teams across varied domains, designing and developing highly scalable big data solutions. This expert will provide technical and development support to government and/or commercial clients to build and maintain modernized Enterprise Data Warehouses (EDW). The SME will research and recommend the appropriate systems and tools for a given project based on the customer’s requirements and the availability of resources, and must therefore demonstrate deep aptitude and proficiency in programming and design techniques. The SME must also demonstrate a deep understanding of modern data analytics technologies such as Hadoop and big data management tools, as well as the hardware systems hosting big data solutions (including modern data storage management), which help shape the design and development of custom datacenter solutions.

The SME understands technical architectures and the current state of the market in several technology areas, and must therefore possess and demonstrate deep knowledge of enterprise data warehouse (EDW) concepts, big data platforms and architectures, and various design alternatives. Knowledge of general data management, particularly data lineage and metadata management, is key because the SME is also responsible for devising data conversion strategies for a system focused on delivering advanced analytical capabilities. In conjunction with the Chief Engineer and/or Lead Systems Engineer, the SME will lead teams performing the necessary formatting and cleansing of legacy or new data. The SME will also work with engineering leadership on the overall systems architecture, scalability plans, reliability, and system performance.

Required Skills: 

• Minimum of 8 years working as an active, contributing member of a development team
• Data integration experience, including the ability to design, document, develop, and test data integration processes from data analyst specifications
• Prior experience analyzing and resolving data quality and integration issues
• Demonstrable experience implementing Big Data solutions using current technologies
• Strong knowledge of data structures, algorithms, enterprise systems, and asynchronous architectures
• Experience in data formatting and cleansing, and an understanding of schemas and metadata
• Practical skill in moving files efficiently within a big data platform
• Experience leading midsized developer teams in data warehouse / Hadoop integration
• Experience with large-scale (>1 TB raw) data processing, ETL, and stream processing
• Ability to design, architect, and (desirably) code to enterprise, commercial, and best-practice standards
 
Desired Skills:

• Demonstrable experience with the Hadoop ecosystem (including HDFS, Spark, Sqoop, Flume, Hive, Impala, MapReduce, Sentry, Navigator); Hadoop data ingestion using ETL tools (e.g., NiFi, Kafka, Talend, Pentaho, Informatica); and Hadoop transformation (including MapReduce, Scala)
• Excellent analytical, consultative, communication, presentation, and management skills
• Ability to code in either Python or R for data analytics
• Experience with big data deployments for cyber analytics
• Experience with Elastic Stack (ELK) visualization using Kibana
• Expert technical judgment and the ability to work effectively with clients as well as IT management and staff
• Experience with continuous integration
• Demonstrated experience and knowledge of relational SQL databases such as SQL Server, Oracle, and DB2, with expert-level SQL skills for data manipulation (DML) and validation
• Experience working in UNIX/Linux environments as well as Windows environments
• Experience creating low-level designs and providing and maintaining technical documentation, specifically data mapping and low-level / ETL (or ELT) design documentation

Required Education:

• B.S./B.A. or higher in Engineering, Information Systems, Management Information Systems, Systems Engineering, Electrical Engineering, Computer Science, Computer Engineering, Computer Information Systems, Computer Systems Engineering, Mechanical Engineering, Physics, or Math
• Minimum of 8 years of relevant work experience