In this paper, I propose a model for writing stories that uses learned event representations to guide the construction of future events and, subsequently, the sentences associated with those events. Whereas event representations composed of tuples of specific elements from a dependency parse lose information in the translation between sentence and event, allowing the model to learn its own event representations, guided by the existing tuple representations, retains information relevant to producing subsequent sentences. The model beats the baseline results and the models from Martin et al. on perplexity for sentence generation, as well as on most of the top-5 accuracy scores. In human evaluation, my model produces significantly better output than Martin et al.'s model, with marginal improvement over a seq2seq model.
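To illustrate why tuple-based event representations are lossy, the sketch below extracts a (subject, verb, object, modifier) tuple, the 4-tuple format described by Martin et al., from a toy, pre-parsed sentence. The hand-built parse input and the `extract_event` helper are hypothetical stand-ins for illustration, not the paper's actual pipeline; note how every word outside the four slots is discarded.

```python
def extract_event(tokens):
    """Reduce a dependency-parsed sentence to a 4-tuple event.

    tokens: list of (word, dependency_label) pairs for one sentence,
    standing in for the output of a real dependency parser.
    """
    slots = {"subject": None, "verb": None, "object": None, "modifier": None}
    for word, dep in tokens:
        if dep == "nsubj":
            slots["subject"] = word
        elif dep == "ROOT":
            slots["verb"] = word
        elif dep == "dobj":
            slots["object"] = word
        elif dep in ("prep", "advmod"):
            slots["modifier"] = word
    # Unfilled slots get an empty placeholder; all other words in the
    # sentence (determiners, adjectives, etc.) are simply dropped,
    # which is the information loss discussed above.
    return tuple(slots[k] if slots[k] else "EmptyParameter"
                 for k in ("subject", "verb", "object", "modifier"))

# "The weary knight slays the ancient dragon" loses "weary" and "ancient":
parse = [("The", "det"), ("weary", "amod"), ("knight", "nsubj"),
         ("slays", "ROOT"), ("the", "det"), ("ancient", "amod"),
         ("dragon", "dobj")]
print(extract_event(parse))  # ('knight', 'slays', 'dragon', 'EmptyParameter')
```

A model that instead learns its own event representations can keep whatever sentence content proves useful for generating the next sentence, rather than being limited to these four slots.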