Data Engineer · São Paulo, Brazil
Building scalable data pipelines, AI-ready data layers, and intelligent integrations. Currently at Skyone, turning raw data into Gold.
About
Junior Data Engineer at Skyone Solutions. I specialize in building ETL pipelines and Data Lakehouse architectures using Python, PySpark, SQL, Databricks, and Azure. Beyond traditional data engineering, I integrate AI agents and LLM APIs into data workflows to drive automation and intelligence. With a strong background in Linux server management and infrastructure, I build efficient, scalable data layers that bridge the gap between raw data and production-ready AI models.
Currently studying for the Databricks Data Engineer Associate certification and pursuing a degree in Systems Analysis & Development at FATEC-SP (2023–2027).
Experience
Skills
Projects
ETL pipeline ingesting live cryptocurrency market data from the CoinGecko API into a PostgreSQL database. Automated scheduling, data validation, and structured storage for analysis.
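The pipeline's shape can be sketched as a fetch → validate → load flow. This is a minimal illustration with hypothetical names, not the project's actual code; the real pipeline targets PostgreSQL, but sqlite3 stands in here so the example stays self-contained:

```python
import json
import sqlite3
import urllib.request

# CoinGecko's public simple/price endpoint (real API; coin IDs are examples).
COINGECKO_URL = (
    "https://api.coingecko.com/api/v3/simple/price"
    "?ids=bitcoin,ethereum&vs_currencies=usd"
)


def fetch_prices(url: str = COINGECKO_URL) -> dict:
    """Extract: fetch live market prices as JSON from the API."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return json.load(resp)


def validate(payload: dict) -> list:
    """Transform: keep only rows with a positive numeric USD price."""
    rows = []
    for coin, quote in payload.items():
        price = quote.get("usd")
        if isinstance(price, (int, float)) and price > 0:
            rows.append((coin, float(price)))
    return rows


def load(rows: list, conn: sqlite3.Connection) -> None:
    """Load: insert validated rows into a structured prices table."""
    conn.execute("CREATE TABLE IF NOT EXISTS prices (coin TEXT, usd REAL)")
    conn.executemany("INSERT INTO prices VALUES (?, ?)", rows)
    conn.commit()


# Typical scheduled run (e.g. from cron or an orchestrator):
# load(validate(fetch_prices()), sqlite3.connect("prices.db"))
```

In the real project a scheduler triggers the run on an interval and the load step writes to PostgreSQL; the same three-stage structure applies.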
Mobile app combining real-time video analysis, AI-powered workout coaching, and social features. A personal side project exploring applied AI in fitness.
Guest on the Low Code podcast by Skyone. We talked about data engineering, low-code platforms, and where AI fits into all of it.