This project investigates the use of pretrained BERT models, extended with a few additional fully connected (FC) layers, to obtain latent sentence representations. We also explore centroid-based clustering to summarize a long review paragraph into 2-3 relevant and important sentences, giving an effective compression rate of around 80%.
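
Below is a minimal sketch of this kind of pipeline, assuming the `bert-base-uncased` checkpoint from Hugging Face `transformers` and scikit-learn's `KMeans`. It is illustrative only: the project's additional FC layers, any fine-tuning, and the exact clustering setup are not shown here, and mean pooling of the last hidden layer stands in for the learned sentence representation.

```python
# Sketch: BERT sentence embeddings + centroid clustering for extractive summarization.
# Assumes: pip install torch transformers scikit-learn
import numpy as np
import torch
from transformers import BertTokenizer, BertModel
from sklearn.cluster import KMeans

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")
model.eval()


def embed_sentences(sentences):
    """Mean-pool BERT's last hidden layer to get one vector per sentence."""
    vectors = []
    with torch.no_grad():
        for sent in sentences:
            inputs = tokenizer(sent, return_tensors="pt", truncation=True, max_length=128)
            hidden = model(**inputs).last_hidden_state  # shape: (1, seq_len, 768)
            vectors.append(hidden.mean(dim=1).squeeze(0).numpy())
    return np.stack(vectors)


def centroid_summary(sentences, n_summary=3):
    """Cluster sentence embeddings and keep the sentence nearest each centroid."""
    embeddings = embed_sentences(sentences)
    n_clusters = min(n_summary, len(sentences))
    km = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit(embeddings)
    chosen = set()
    for centroid in km.cluster_centers_:
        idx = int(np.argmin(np.linalg.norm(embeddings - centroid, axis=1)))
        chosen.add(idx)
    # Return the selected sentences in their original order.
    return [sentences[i] for i in sorted(chosen)]


if __name__ == "__main__":
    review = [
        "The battery easily lasts two full days.",
        "Shipping took longer than promised.",
        "The camera struggles in low light.",
        "Overall the phone feels fast and well built.",
        "Customer support was slow to respond.",
    ]
    print(centroid_summary(review, n_summary=3))
```

Selecting the sentence closest to each cluster centroid (rather than generating new text) keeps the summary extractive, so the output sentences are always grammatical and traceable to the source review.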