Designs, develops, and tests Data Engineering and BI solutions such as databases, data warehouses, queries and views, reports, and dashboards.
Performs data conversions, imports, and exports within and between internal and external software systems.
Integrates BI platforms with enterprise systems and applications.
Improves business intelligence tool performance by defining data filtering and indexing strategies.
Documents new and existing models, solutions, and implementations.
Implements security measures using AWS IAM, Amazon VPC, subnets, and AWS Secrets Manager.
Key areas: Glue Workflows and Jobs, ETL automation, data migrations, dashboard development, DevOps, AWS services, algorithmic intelligence.
Team Lead for multiple client projects.
Provide Big Data and Data Science solutions using AWS (Amazon Web Services).
Proficient in Big Data, Data Lake, ETL, MapReduce, Data Warehouse, and BI reporting.
Established and maintained productive working relationships.
Project quality reviews (code reviews, design reviews, and hands-on involvement).
Ensure customer requirements and deadlines are met.
Work in Agile and Kanban delivery models.
Languages: Python, SQL.
Cloud Services: AWS RDS, AWS Lambda, Amazon EMR, Amazon EC2, Amazon S3, API Gateway, AWS Glue Data Catalog, Amazon Athena, Amazon Redshift, AWS CodeCommit & CodePipeline, Amazon VPC, AWS IAM, AWS Step Functions, Amazon QuickSight, Amazon DynamoDB, AWS CloudFormation, AWS Secrets Manager, Amazon CloudWatch, AWS Glue Workflows.
Packages: pandas, NumPy, PySpark, Hadoop, SQLAlchemy.
Research, design, and implement scalable applications for information identification, extraction, analysis, retrieval, and indexing.
Work in Agile model with project managers, developers, quality assurance, and customers to resolve technical issues.
Perform business analysis and propose solutions based on customer requirements.