Cloud Architect

Job Title: Cloud Architect

Location: Chennai & Hyderabad

Experience: 10 to 12 Years

Must-Have Skills: AWS/Azure/GCP, SQL, Data Modelling & Reporting Tools, Hadoop

Job Reference ID: BT/F4A/IND


We are excited to welcome a talented and experienced Cloud Architect to our team. In this role, you will have the opportunity to work on a variety of projects and industries alongside a team of experienced professionals. You will have access to cutting-edge tools and technologies, along with ongoing learning and development opportunities to help you grow your skills. We are looking for someone who is passionate about data engineering and committed to staying up to date with the latest advancements in the field. If you meet our mandatory qualifications and possess some or all of our preferred qualifications, we encourage you to apply and join our team.

Job Summary:
The Cloud Architect (Big Data) will be accountable for delivering proof-of-concept projects and topical workshops, and for leading implementation projects. These professional services engagements will focus on key customer solutions such as web applications, enterprise applications, data warehouse migration, big data, archiving, and disaster recovery. The role covers all aspects of data and information (both structured and unstructured) and supports the development of enterprise systems from business requirements to logical architecture. It spans the full information management life cycle, from acquisition, cleansing, data modelling, transformation, and storage to presentation, distribution, security, privacy, and archiving.

Job Responsibilities:

➢ Guiding project teams and data-centric organizations on the best ways to use enterprise architecture capabilities to deliver higher value to the business at a faster pace.

➢ Helping Project Managers identify key data and information risks, mitigation plans, effort estimates, and plans.

➢ Using domain expertise to influence future capabilities in architecture frameworks for the Information Architecture domain.

➢ Collaborating with Lead Architects across the other architectural domains – namely Business, Application, Technology, and Security – to ensure alignment of strategies.

➢ Developing standards, domain principles, and best practices for creating and maintaining architecture artifacts (including inventories and models), and articulating the value of those artifacts.

➢ Hosting peer reviews to assess status and compliance with architecture standards.

➢ Ensuring sufficiency of architecture requirements for top projects.

➢ Being accountable for the development of conceptual, logical, and physical data models, and for the use of RDBMSs, operational data stores (ODS), data marts, and data lakes on target cloud platforms: Azure/GCP/AWS PaaS (SQL/NoSQL).

➢ Being accountable for, and governing, the expansion of the existing data architecture and the optimization of data query performance through the most appropriate solutions. The person should be able to work both independently and cross-functionally.



Mandatory Qualifications:

➢ Overall 10+ years of experience in the IT industry.

➢ Graduate (any degree) with 60% and above.

➢ More than 5 years of demonstrated ability with normalized and dimensional modelling techniques, star and snowflake schemas, modelling slowly changing dimensions, dimensional hierarchies, and data classification, ideally at enterprise scale as well as at the organizational level.

➢ 5+ years of experience with high-scale/distributed RDBMSs.

➢ Expertise in Data Quality, Data Profiling, Data Governance, Data Security, Metadata Management, MDM, Data Archival and Data Migration strategies using appropriate tools.

➢ Ability to define and govern data modelling and design standards, tools, best practices, and related development for enterprise data models.

➢ Hands-on data modelling across canonical, semantic, logical, and physical data models, including design, schema synchronization, and performance tuning.

➢ Hands-on experience in:

➢ Data modelling tools (e.g., Erwin Data Modeler, ArchiMate, ER/Studio, DbSchema, etc.)

➢ ETL (Extract-Transform-Load) tools (e.g., Informatica, Google Dataflow, Azure Data Factory, Talend, etc.)

➢ BI and reporting tools (e.g., Tableau, Power BI, MicroStrategy, etc.)

➢ Demonstrated ability to succeed in a fast-paced and changing environment with interruptions and multiple tasks/projects occurring simultaneously.

➢ Ability to work independently and have skills in planning, strategy, estimation, scheduling, and project management.

➢ Strong problem-solving, influencing, communication, and presentation skills; a self-starter.

➢ Ability to define reusable components, frameworks, common schemas, standards & tools to be used.

➢ Ability to mentor and guide Technical Leads and Software Engineers (both internal and external).

➢ Cloud computing infrastructure (e.g., Amazon Web Services EC2, Elastic MapReduce) and considerations for scalable, distributed systems.

➢ NoSQL platforms (e.g., key-value stores, graph databases).

➢ Data modelling techniques for NoSQL data and cloud data platforms (e.g., AWS, Azure, GCP).

➢ High-scale or distributed cloud-native data platforms.

➢ Experience in architecting, designing, and executing data lake solutions in CSPs (preferred: Azure/GCP, AWS).

➢ AWS technologies: EC2, ECS, Lambda, S3, EBS, EFS, Redshift, RDS, DynamoDB, VPC, CloudWatch, CloudFormation, CloudTrail, OpsWorks, IAM, Directory Service, Ansible.

➢ MS Azure: Security Center, Azure Active Directory (Core, Developer, B2C, Services), Key Vault, and an understanding of securing PaaS solutions such as SQL Data Warehouse, ADF, SQL DB, Azure App Service, etc.

➢ GCP: Google app stack and Google Big Data stacks.

➢ Hands-on experience (preferred) or familiarity with any of the CI/CD tools – GitHub, Ansible, Jenkins, Spinnaker, and Istio.

➢ Experience with, or at least awareness of, microservice-based architecture is desirable.

➢ Experience in data domain modelling and data design with tools such as ArchiMate or Erwin.



Preferred Qualifications:

We are looking for someone who possesses strong analytical and problem-solving skills and the ability to translate ambiguity and incomplete information into insights and impactful actions understood by business and IT leaders. They need to balance operating the business today with understanding and influencing the business and data technologies of the future. The role requires the curiosity to engage all business units and functions of the company while partnering with external organizations on the best approaches and solution delivery.


➢ Solutions Architect (Associate)-level certification in AWS, MS Azure, or GCP.

➢ Hands-on experience of the following:

➢ Hadoop stack (e.g., MapReduce, Sqoop, Pig, Hive, HBase, Flume)

➢ Analytical tools, languages, or libraries (e.g., TensorFlow, Spark, PySpark, KNIME, etc.)

➢ Related/complementary open-source software platforms and languages (e.g., Java, Python, Spark, Scala, Kubernetes, Docker, etc.)





Benefits:

➢ Competitive salary and benefits package.

➢ Opportunity to work on a variety of projects and industries.

➢ Opportunity to work with a team of experienced and talented professionals.

➢ Access to cutting-edge tools and technologies.

➢ Opportunity to grow and develop your skills through ongoing learning and development opportunities.





Salary: Commensurate with experience and demonstrated competence.

Contact: hr@bigtappanalytics.com




