
Job Opening

Job Title

Enterprise Architect 2



Job ID

6684 - KC

Hello there, we are seeking an Enterprise Architect 2 for our client located in Austin, TX.

Please review the details below.
If you have any questions or concerns, please reach out to our IT Recruiter, Kari Cooper, at 512-543-9566.

Title: Enterprise Architect 2
Duration: 8+ months Contract
Rate: $72/hr. W2
Location: Austin, TX

The Cloud Data Architect’s responsibilities may include:
Building technical solutions required for optimal ingestion, transformation, and loading of data from a wide variety of data sources using open source, AWS, Azure or GCP ‘big data’ frameworks and services.
Working with the product and software teams to provide feedback surrounding data-related technical issues and support for data infrastructure needs uncovered during customer engagements / testing.
Understanding and formulating processing pipelines of large, complex data sets that meet functional / non-functional business requirements.
Creating and maintaining optimal data pipeline architecture.
Working alongside Cloud Data Engineers, Cloud System Developers and Cloud Enablement Manager to implement Data Engineering solutions.
Redesigning and building a new data ecosystem on the cloud.
Extending on-premise data supply chain and modernizing data supply chain on the cloud.
Collaborating with the customer’s data scientists and data stewards during workshop sessions to uncover more detailed business requirements related to data engineering.
Building business cases to support MDM efforts and adoption.
Providing subject-matter expertise in the assessment of solution proposals and concepts related to Master Data Management, Data Quality, Data Cataloging, and Metadata management and publication.
Facilitating cross-functional discussions and working groups.
Identifying, defining, and communicating success factors.
Performing other duties as assigned.

Minimum Requirements:
Years Skills/Experience

8 Experience in building scalable end-to-end data ingestion and processing solutions
8 EIM Principles: architecture, sourcing, ETL, data modeling, pipelines and connectors, integration hubs, data access (SOA, API, SQL), platforms (traditional servers, cloud, hybrid), database types (traditional, proprietary MPP)
8 Excellent understanding of EIM principles, capabilities, and best practices with extensive experience and domain knowledge in the areas of Master Data Management, Reference Data, Data Quality, Metadata Management, and Data Governance capabilities
8 Good understanding of data infrastructure and distributed computing principles
5 Good understanding of data governance and how regulations such as HIPAA and FedRAMP can impact data storage and processing solutions
5 Ability to identify and select the right tools for a given problem, such as knowing when to use a relational or non-relational database
5 Working knowledge of non-relational and row/columnar based relational databases
5 Confidently taking responsibility for the technical output of a project; ability to quickly pick up new skills and learn on the job
5 Metadata Management experience including business definitions, business processing rules, data lineage
5 Master Data Management experience, with a good understanding of MDM full-lifecycle concepts/techniques
4 Understanding of database and analytical technologies in the industry including massive parallel processing and NoSQL databases, cloud data warehouse and data lake design, BI reporting and dashboard development
4 A successful history of manipulating, processing and extracting value from large disconnected datasets
4 Comfortably working with various stakeholders such as data scientists, architects and other developers
4 Previous experience managing Master Data Management and Data Governance deliverables of functional teams, such as Strategic plans/roadmaps, Frameworks, Technical Design Documents and/or Data Models
2 Experience with Machine Learning toolkits
2 Experience with object-oriented and/or functional programming languages, such as Python, Java and Scala
2 Proficient at implementing data processing workflows using Hadoop and frameworks such as Spark and Flink
2 Delivering production scale data engineering solutions leveraging one or more cloud services
Strong Communication skills – able to clearly articulate the vision and confidently communicate, both verbally and in writing, with all stakeholder levels: Texas Health and Human Services, Customer, 3rd Parties and Partners. Able to identify core messages and act quickly and appropriately.
Strong Technical writing skills

Years Skills/Experience

4 Experience with Healthcare or Insurance business domain
2 Informatica Experience