This document presents ReComp, a framework for efficiently re-computing big data analytics processes when their inputs or algorithms change. ReComp uses fine-grained process provenance and execution history to estimate the impact of a change and to selectively re-execute only the affected parts of a process, rather than re-running it from scratch. The framework was evaluated on two case studies, genomic variant analysis (the SVI tool) and simulation modeling, where selective re-execution saved 28-37% of the time of a complete re-run. ReComp is a generic approach but can be customized for specific processes and change types.
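To illustrate the general idea of provenance-driven selective re-execution, the following minimal Python sketch models a pipeline whose steps record the inputs they consumed; when an input changes, only the steps that transitively depend on it are re-run, while cached outputs are reused elsewhere. The `Step` and `Pipeline` classes and the three-step example are hypothetical simplifications, not ReComp's actual API or impact-estimation logic.

```python
# A minimal sketch (not ReComp's actual implementation) of provenance-driven
# selective re-execution: each step records which inputs it consumed; when an
# input changes, only steps that transitively depend on it are re-executed,
# and cached outputs are reused for everything else.

from dataclasses import dataclass, field
from typing import Callable, Dict, List, Set


@dataclass
class Step:
    name: str
    func: Callable[[Dict[str, object]], object]
    deps: List[str]                      # names of inputs or upstream steps


@dataclass
class Pipeline:
    steps: List[Step]                    # assumed to be topologically ordered
    cache: Dict[str, object] = field(default_factory=dict)      # past outputs
    provenance: Dict[str, Set[str]] = field(default_factory=dict)

    def _affected(self, changed: Set[str]) -> Set[str]:
        # Propagate the change set forward through recorded dependencies.
        affected = set(changed)
        for step in self.steps:
            if affected & set(step.deps):
                affected.add(step.name)
        return affected

    def run(self, inputs: Dict[str, object], changed: Set[str]) -> Dict[str, object]:
        affected = self._affected(changed)
        env = dict(inputs)
        for step in self.steps:
            if step.name in affected or step.name not in self.cache:
                env[step.name] = step.func(env)          # re-execute this step
                self.cache[step.name] = env[step.name]
                self.provenance[step.name] = set(step.deps)
            else:
                env[step.name] = self.cache[step.name]   # reuse cached result
        return env


# Hypothetical usage: a toy variant-analysis pipeline in which only the
# annotation step depends on the reference database 'db'.
pipe = Pipeline(steps=[
    Step("align", lambda e: f"aligned({e['reads']})", ["reads"]),
    Step("call_variants", lambda e: f"variants({e['align']})", ["align"]),
    Step("annotate", lambda e: f"annotated({e['call_variants']}, {e['db']})",
         ["call_variants", "db"]),
])
pipe.run({"reads": "r1", "db": "v1"}, changed={"reads", "db"})  # first, full run
out = pipe.run({"reads": "r1", "db": "v2"}, changed={"db"})     # only 'annotate' re-runs
```

In this toy run, updating the reference database re-executes only the annotation step, which is the kind of partial re-computation from which ReComp derives its reported savings.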