Career Profile

I am a PhD student in Computer Science working on traceability across heterogeneous artifacts managed by diverse modeling platforms. I started out as a physics engineer, but my real passion is software development. My work experience made me realise this, and I finally ended up studying for a master's degree in Software Engineering at the University of York in 2016. The academic programme and life in York were so enjoyable that I decided to stay for a PhD. I am part of the Enterprise Systems research group in the Department of Computer Science, where we enable and promote Model-Driven Engineering for large and complex software systems.

Experience

Project Manager

2015 (5 months)
DevSpace/Embsoft, Merida, Mexico

Web Developer in Environmental Department [Community Service]

2015 (7 months)
Universidad Autonoma de Yucatan, Merida, Mexico

Java Web Application Developer [Internship]

2014 (6 months)
CECIMA/WinDesign, Aix-en-Provence, France

Environment Monitoring System Developer [Internship]

2013 (1 month)
CICY, Merida, Mexico

Projects

RestMule - A framework for handling various remote service policies, such as a limited number of requests within a period of time and multi-page responses, by generating resilient clients that are able to handle request rate limits, network failures, response caching, and paging in a graceful and transparent manner.
Epsilon (Contributor) - A family of languages and tools for code generation, model-to-model transformation, model validation, comparison, migration and refactoring that work out of the box with EMF and other types of models.
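The resilient-client behaviour described above can be illustrated with a minimal sketch. This is not RestMule's actual API; `request_page` is a hypothetical stand-in for a generated client call, and the backoff values are illustrative only:

```python
import time

def fetch_all_pages(request_page, max_retries=3, backoff=0.01):
    """Collect items from a paged remote API, retrying on rate-limit responses.

    `request_page(page)` is a placeholder for a generated client operation;
    it is assumed to return (status, items, has_next), where status is an
    HTTP-style code (200 or 429).
    """
    items, page, retries = [], 0, 0
    while True:
        status, batch, has_next = request_page(page)
        if status == 429:
            # Rate limited: back off exponentially, then retry the same page.
            retries += 1
            if retries > max_retries:
                raise RuntimeError("rate-limit retries exhausted")
            time.sleep(backoff * 2 ** retries)
            continue
        retries = 0
        items.extend(batch)
        if not has_next:
            return items  # all pages consumed
        page += 1
```

The point of the sketch is that paging and rate-limit recovery are handled inside the client loop, so the caller sees a single transparent call that yields the full result set.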

Publications

2018 - RestMule: Enabling Resilient Clients for Remote APIs - 15th International Conference on Mining Software Repositories (MSR 2018). Mining data from remote repositories, such as GitHub and StackExchange, involves the execution of requests that can easily reach the limitations imposed by the respective APIs to shield their services from overload and abuse. Therefore, data mining clients are left alone to deal with such protective service policies, which usually involves an extensive amount of manual implementation effort. In this work we present RestMule, a framework for handling various service policies, such as a limited number of requests within a period of time and multi-page responses, by generating resilient clients that are able to handle request rate limits, network failures, response caching, and paging in a graceful and transparent manner. As a result, RestMule clients generated from OpenAPI specifications (i.e. standardized REST API descriptors) are suitable for intensive data-fetching scenarios. We evaluate our framework by reproducing an existing repository mining use case and comparing the results produced by employing a popular hand-written client and a RestMule client.
2016 - Hardening Clients for Remote APIs - (MSc in Software Engineering dissertation). Mining data from remote repositories, such as GitHub and StackExchange, involves the execution of requests that can easily reach the limitations imposed by the respective APIs to shield their services from overload and abuse. Therefore, data mining clients are left alone to deal with such protective service policies, which usually involves an extensive amount of manual implementation effort. In this work we present RestMule, a framework for handling various service policies, such as a limited number of requests within a period of time and multi-page responses, by generating resilient clients that are able to handle request rate limits, network failures, response caching, and paging in a graceful and transparent manner. As a result, RestMule clients generated from OpenAPI specifications are suitable for intensive data-fetching scenarios.