Description
This is a remote position.
The Sr. Azure Databricks Architect at Koantek builds secure, highly scalable big data solutions to achieve tangible, data-driven outcomes, all while keeping simplicity and operational effectiveness in mind. The architect is expected to evangelize and educate others on engineering design and development practices, and to function as a productive member of a team, working and communicating proactively with engineering peers, technical leads, project managers, product owners, and resource managers.
Requirements
Specific requirements for the role include:
Expert-level knowledge of data frameworks, data lakes and open-source projects such as Apache Spark, MLflow, and Delta Lake
Must have data migration experience from on-premises to cloud
In-depth understanding of Azure cloud, Data Lake, and Analytics solutions
Hands-on experience designing and developing applications using Azure ADF, Databricks, SSIS, and Airflow
In-depth understanding of Spark architecture, including Spark Core, Spark SQL, DataFrames, Spark Streaming, RDD caching, and Spark MLlib
Knowledge of different programming and scripting languages
Mid-level knowledge of code versioning tools (such as Git, Bitbucket, or SVN)
Expert proficiency in Python, C++, Java, R, and SQL
Good to have: programming experience with .NET or Spark/Scala
Expertise in using Spark SQL with various data sources such as JSON, Parquet, and key-value pair formats
Broad experience in Microsoft SQL technologies including SSAS Tabular models, DAX, T-SQL, Service Broker, Replication, and Performance Tuning
Broad multi-tenant data architecture and implementation experience across different data stores (e.g., Azure Data Lake Gen2, Azure SQL Data Warehouse, Azure Blob Storage, HDFS), messaging systems (e.g., Azure Event Hubs, Apache Kafka), and data processing engines (e.g., Azure Data Lake Analytics, Apache Hadoop, Apache Spark, Apache Storm, Azure HDInsight)
Experience using Azure DevOps and CI/CD as well as Agile tools and processes including Git, Jenkins, Jira, and Confluence
Experience in creating tables, partitioning, bucketing, loading and aggregating data using Spark SQL/Scala
Able to build ingestion to ADLS and enable BI layer for Analytics
Experience in Machine Learning Studio, Stream Analytics, Event/IoT Hubs, and Cosmos DB
Knowledge of SOA and Microservices application architecture
Implementation of security protocols
Nice to have: Machine Learning and Artificial Intelligence experience
Strong understanding of data modeling and defining conceptual, logical, and physical data models
Proven track record of building deep sales/technical relationships with CXOs and practice-building executives within highly strategic systems integrators
Experience in managing various client relationships and driving consensus on solutions
Big Data/analytics/information analysis/database management in the cloud
IoT/event-driven/microservices in the cloud
Experience with private and public cloud architectures, their pros/cons, and migration
Proficient level experience with architecture design, build and optimization of big data collection, ingestion, storage, processing, and visualization
Proficient in building, automating, and deploying data pipelines and workflows into end-user facing applications
Ability to remain up to date with industry standards and technological advancements that will enhance data quality and reliability to advance strategic initiatives
Basic experience with or knowledge of agile methodologies
Technical expertise with data models, data mining, and segmentation techniques
Working knowledge of RESTful APIs, OAuth2 authorization framework and security best practices for API Gateways
Expert at diagnostic and problem resolution providing third-level support
Familiarity working with unstructured data sets (i.e., voice, image, log files, social media posts, email)
Possess an organized methodical approach and bring a continuous improvement mindset
Highly creative and innovative technologist that thrives independently and collaborates well in a team environment
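As a concrete illustration of the Spark SQL skills listed above (creating tables, partitioning, bucketing, loading, and aggregating data), a candidate might be asked to write DDL and queries along these lines. This is a minimal sketch; the table names, column names, and storage paths are hypothetical, not taken from the posting:

```sql
-- Hypothetical example: a Delta table partitioned by date.
CREATE TABLE IF NOT EXISTS events (
  event_id   STRING,
  user_id    STRING,
  amount     DOUBLE,
  event_date DATE
)
USING DELTA
PARTITIONED BY (event_date);

-- Load data from a raw Parquet source into the partitioned table.
INSERT INTO events
SELECT event_id, user_id, amount, event_date
FROM parquet.`/mnt/raw/events/`;

-- A Hive-style bucketed table (bucketing applies to file-based formats
-- such as Parquet) to speed up joins and lookups on user_id.
CREATE TABLE IF NOT EXISTS events_by_user (
  event_id STRING,
  user_id  STRING,
  amount   DOUBLE
)
USING PARQUET
CLUSTERED BY (user_id) INTO 8 BUCKETS;

-- Aggregate per partition.
SELECT event_date, COUNT(*) AS event_count, SUM(amount) AS total_amount
FROM events
GROUP BY event_date;
```

Partitioning by a low-cardinality column such as a date prunes whole directories at query time, while bucketing by a high-cardinality key distributes rows evenly across a fixed number of files.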
Responsibilities:
· Work closely with team members to lead and drive enterprise solutions, advising on key decision points on trade-offs, best practices, and risk mitigation
· Educate clients on Cloud technologies and influence the direction of the solution
· Design and architect solutions on Microsoft Azure and related technologies including ADF, ADLS, ASA, AAS, AML and Power BI
· Manage data-related requests, analyze issues, and provide efficient resolution
· Design all program specifications and perform required tests
· Design and develop the data ingestion layer using ADF and the processing layer using Databricks
· Work with the SMEs to implement data strategies and build data flows
· Prepare code for all modules according to required specifications
· Monitor all production issues and inquiries and provide efficient resolution
· Evaluate all functional requirements, map documents, and troubleshoot all development processes
· Document all technical specifications and associated project deliverables
· Design all test cases to provide support to all systems and perform unit tests
Qualifications:
2+ years of hands-on experience designing and implementing multi-tenant solutions using Azure Databricks for data governance, data pipelines for near real-time data warehouse, and machine learning
3+ years of design and development experience with scalable and cost-effective Microsoft Azure data architecture and related solutions (e.g., Azure Data Lake Gen2, Azure Data Factory, CDM, Event Hub, Azure Databricks, Synapse, Azure SQL DW)
5+ years of experience in software development, data engineering, or data analytics using Python, Scala, Spark, Java, or equivalent technologies, with broader overall experience in the IT industry
Bachelor's or Master's degree in Big Data, Computer Science, Engineering, Mathematics, or a similar area of study, or equivalent work experience
Strong written and verbal communication skills
Ability to manage competing priorities in a fast-paced environment
Ability to resolve issues
Self-motivated, with the ability to work independently
Nice to have:
- Advanced technical certifications: Azure Solutions Architect Expert, DASCA Big Data Engineering and Analytics
- Project Management experience
- Hands-on experience with AWS (required), Azure, or GCP; Databricks experience is good to have
- Must have hands-on proficiency in Python, SQL, and shell scripting; Spark SQL, NoSQL, and PySpark are a plus
- Advanced development expertise in Python, including the ability to develop back-end components