SDLC

  • Data Architect

    Posted Date: 1 week ago (10/10/2018 1:45 PM)
    Job ID: 2018-1589
    # of Openings: 1
    Location: Pittsburgh
    Category: Business Consulting
  • Overview

    SDLC Partners is a dynamic, fast-paced, privately held consulting firm with 400+ employees. We deliver customized digital solutions that transform organizations through our uniquely enabled talent, processes, and leadership. Through full-scope partner, strategic, and improvement solutions, we give clients access to some of the best talent available across our service lines. We hire individuals who embody our goal of enabling performance for our clients, and we believe strongly in growing and developing talent.

    Responsibilities

    The Data Architect is responsible for delivering the data development, reporting, analytics, and business intelligence solutions that meet the information needs and strategic goals of the organization.

    • Develop a target data architecture that enables the intended business architecture and architecture vision, while addressing the Request for Architecture Work and stakeholder concerns. Identify candidate architecture roadmap components based on gaps between the baseline and target data architectures
    • Provide overall direction, oversight, and successful delivery of data design and implementation
    • Apply industry standards for development of data architecture, design principles, and quality control
    • Collaborate with Architecture & Governance leadership to develop and evolve data standards for the organization


    Qualifications

    • 10+ years of professional experience in IT or consulting with focus on software development
    • Bachelor’s or Graduate degree in Computer Science, Information Systems, a related degree, or equivalent experience
    • At least 3 years of hands-on experience with Hadoop, MapReduce, Hive, Sqoop, Splunk, Storm, Spark, Kafka, and HBase
    • Experience with end-to-end solution architecture for data capabilities, including:
      • ELT/ETL development, patterns, and tooling (Informatica, Talend)
      • BI tools (BusinessObjects, Tableau) and other visualization tools
      • Advanced analytics (SAS, Python, R)
      • Test-driven development and SCM tools
    • Thorough understanding of best practices for building data lake and analytical architectures on Hadoop
    • Strong scripting and programming background (Unix shell and Python preferred)
    • Strong SQL experience with the ability to develop, tune, and debug complex SQL applications
    • Expertise in schema design and data modeling, with a proven ability to work with complex data
    • Experience in real time and batch data ingestion
    • Proven experience working in large data environments (RDBMS, EDW, NoSQL, etc.)
    • Understanding of security, encryption, and masking using technologies such as Kerberos, MapR tickets, Vormetric, and Voltage
    • Familiarity with Hortonworks distribution of Hadoop is preferred

